Survey Practice
Vol. 16, Issue 1, 2023 | October 12, 2023 EDT

A Low-cost Method to Try to Improve Panel Survey Representation

Paul J. Lavrakas and Sebastian Kocar
Keywords: Survey Panels, Representation, Costs
https://doi.org/10.29115/SP-2023-0014
Lavrakas, Paul J., and Sebastian Kocar. 2023. “A Low-Cost Method to Try to Improve Panel Survey Representation.” Survey Practice 16 (1). https://doi.org/10.29115/SP-2023-0014.

Abstract

Survey researchers are perpetually faced with the challenge of trying to balance quality, time, and cost. However, it is essentially impossible for surveys to achieve high quality, a quick turnaround time, and low costs. So, most often researchers settle for trying to achieve two of these three. As survey costs have escalated, funders have become unwilling to keep spending more on surveys, and what often happens is that quality is compromised. It is within this context that we propose a low-cost approach to improve panel survey quality by increasing a panel’s representation of its target population. This includes both the quality of the initial sample recruited to join the panel and the sample that remains active within the panel. Our approach addresses a way to raise the representativeness of initial and ongoing panel samples essentially without raising ongoing costs. We present a case study of this approach, using the Life in Australia™ panel. We began by asking 1,557 panel members an open-ended question about why they joined and remained in the panel. We then content analyzed the responses to create quantitative data which could be analyzed statistically. Also, we gathered all the communications used with these panel members and performed a qualitative content analysis to identify the themes used to try to persuade the sampled panelists to join and stay active in the panel. We then compared the two sets of findings. There were six motivations that panelists reported that were not included in the recruitment and maintenance communications that were used with potential panelists and currently inactive panelists. We also found that panelists with certain background characteristics were more likely to report certain motivations for being in the panel. We acknowledge that our approach is not a panacea but believe it adds to the “toolbox” of panel companies.

Introduction

Survey researchers are perpetually faced with the three-fold challenge of trying to balance quality, time, and cost. However, it is essentially impossible for their surveys to achieve high quality, a quick turnaround time, and low costs. So, most often researchers, including those who manage panels, appear to settle for trying to achieve two of these three and appear willing to accept that, in the words of the old Meatloaf song, “Two Out of Three Ain’t Bad” (Aday and Steinman 1977).

As survey costs have continued to escalate in the past three decades, most survey funders have become unwilling to keep spending more on surveys, and so what often happens is that recruitment quality is compromised for lower cost and quicker turnaround times. Yet we believe that this does not necessarily have to be the case if more researchers were to heed the implications of leverage-saliency theory more closely (cf. Groves, Singer, and Corning 2000) when devising the recruitment protocols used in their surveys. Leverage-saliency theory is based on the premise that different respondents/panelists have different sets of motivations for agreeing to complete survey questionnaires, and that they are likely to respond differentially to the various persuaders that researchers use to gain their initial and ongoing cooperation.

It is within this context that we propose a low-cost approach reasoned to improve panel survey quality by increasing a panel’s representation of its target population. Our approach involves gathering open-ended verbatims from panelists about why they joined and have stayed active in the panel, and then reviewing and revising the communications that the organization uses with new panelists by conducting content analyses of both the verbatims and the existing communications that the panel’s organization uses for the recruitment and the maintenance of its panelists. That is, our approach primarily addresses a way to identify the persuasive communications that should be used to help raise the representativeness of initial and ongoing panel samples essentially without raising ongoing costs. Furthermore, although some of these communicated motivators/persuaders are likely to appeal to essentially all sampled panelists, other persuaders may appeal more to certain types of panelists. This “tailored” approach is highly consistent with leverage-saliency theory.

What follows is a case study demonstration of our approach for identifying the most effective set of persuasive appeals to use when recruiting for one’s panel and for maintaining membership in it. We do this by using research with the Life in Australia™ panel of the Social Research Centre (SRC) at Australian National University. Of note, although our approach uses data from one Australian probability-based panel, we believe that it is applicable to any panel. However, we readily acknowledge that we have not been able to test whether desired outcomes are eventually achieved. The latter will become known if, and when, survey organizations formally experiment with the approach that we propose.

In devising our mixed methods approach, we decided that both quantitative and qualitative research would be needed. Building on past research on motivations to participate in online panel research (e.g., Bosnjak, Tuten, and Wittmann 2005; Brüggen et al. 2011; Han et al. 2009), we began by asking panel members an open-ended question, as part of the questionnaire used in the 28th wave of data collection from the panel, as to why they joined and remained in the SRC panel. We then content analyzed the verbatim written responses to create quantitative data which could be analyzed statistically. In addition, we assembled all the communications that had been used with these SRC panel members (including those for recruitment and maintenance) and then performed a qualitative content analysis to identify the themes that the SRC used to try to persuade the potential and existing panelists to join and stay active in the panel (cf. Kocar and Lavrakas 2023).

Of note, Dillman (2020) has argued that knowing what to include in recruitment communications, as well as when (and with whom) to use those communications, is an understudied area in countering nonresponse. Furthermore, the existing research on the effectiveness of recruitment communications has been mostly limited to cross-sectional surveys (excluding panel maintenance communication), to recruitment text length and placement (e.g., Einarsson, Cernat, and Shlomo 2021; Martinsson, Dumitrescu, and Riedel 2017), to the application of social-psychological theory in communications (e.g., Greenberg and Dillman 2021), to the use of video (e.g., Kalleitner, Mühlböck, and Kittel 2022), and to the use of humor (e.g., Moore and Varghese 2019). The use of motivational factors in persuasive recruitment and maintenance communications (e.g., Martinsson, Dumitrescu, and Riedel 2017) has been less explored, especially when it comes to using a holistic approach, such as the one we are reporting here.

After we conducted both studies, we compared the two sets of findings. In cases where SRC panel members identified motivations for joining and staying active in the panel that the SRC was not currently using in its communications, we recommended that the SRC consider incorporating these motivations into its future communications with potential and existing panelists.

The approach we devised and carried out is consistent with other views of Dillman (2020), who opined that theories of survey participation (a) tend to be dated, (b) ignore certain techniques for increasing cooperation, (c) are limited to single-mode recruitment, and (d) ignore how the design of each survey contact message and all associated persuasive materials should be formulated. It is in the spirit of Dillman’s advice that we conceived of the approach that we describe and recommend be replicated by other panel organizations.

Of note, we also readily acknowledge that by being able to gather data only from existing panel members, and not being able to gather data from those who were sampled to join the Life in Australia™ panel but failed to join, or from those who attrited from the panel after originally joining, we have an incomplete understanding of how to enhance the representation of the panel’s target population more robustly.[1]

Methods

We carried out two content analyses for our research: one using mixed-methods (cf. Coe and Scacco 2017) and the other qualitative methods (cf. Roller and Lavrakas 2015). Each was related to motivations for potential and ongoing panel members to join and stay active in the Life in Australia™ panel, which is a mixed-mode recruited probability-based panel of the Australian adult population using online and telephone data collection with approximately 10,000 panelists as of September 2023. The panel was originally established by the SRC in 2015.

Quantitative Content Analysis. A total of 2,000 Life in Australia™ panelists who responded to the June 2019 survey (Wave 28) were asked an open-ended item about their experiences participating in the panel:

We would like to understand why you chose to be part of Life in Australia™ and what, if anything, you value about being part of it. What does being part of Life in Australia™ mean to you and what motivates you to participate in the surveys?

Of the 2,000 panelists, 1,715 (86%) provided an answer, but only 1,557 (78% of the 2,000) consented to their data being used for our research purposes. Two random subsets of 100 verbatims each from the 1,557 verbatims were reviewed separately by the two authors (one set each) of this study. In comparing the coding themes which they independently identified, a high degree of agreement was found. This qualitative process led to 17 motivational categories being chosen. We then applied the 17 categories to each of the 1,557 verbatims and quantitatively coded as many as three motivations reported by each panelist. The intercoder reliability for this coding had an initial kappa of .61. However, when there was a discrepancy between the two author-coders’ initial codings of a panelist’s verbatim, they mutually discussed its resolution. Had more resources been available for this project, then a more formal (but also more costly) content analysis could have been carried out, which would likely have yielded outcomes possessing greater reliability. However, the level of effort in which we were able to engage was, in our view, sufficiently fit for purpose, because we believe that it is unlikely that the outcomes from a more formal analysis would have had greater construct validity.
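To illustrate the intercoder-reliability step, here is a minimal sketch of computing Cohen’s kappa for two coders’ independent assignments, e.g., with scikit-learn; the category codes below are hypothetical stand-ins for our 17 motivational categories, not our actual data.

from sklearn.metrics import cohen_kappa_score

# Hypothetical category codes assigned independently by two coders to
# the same ten verbatims; a real run would cover the shared subset of
# the 1,557 verbatims.
coder_a = [1, 2, 2, 5, 7, 1, 3, 8, 2, 11]
coder_b = [1, 2, 4, 5, 7, 1, 3, 8, 2, 6]

# Cohen's kappa measures agreement beyond what chance alone predicts;
# discrepant codings would then be discussed and resolved.
print(f"kappa = {cohen_kappa_score(coder_a, coder_b):.2f}")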

Qualitative Content Analysis. The 14 written and spoken communications devised/deployed by the SRC that were used with potential (used at the time of their recruitment) and ongoing panelists were qualitatively content analyzed by the two authors. For this, we gathered all of the SRC’s scripts and other communication materials that were used for panel recruitment/maintenance and did a qualitative content analysis of them to identify the motivational “themes” that they contained. That is, we separately read each communication and devised our own list of themes and then met to compare each other’s findings. The vast majority of the findings were essentially identical, and we easily resolved the few differences. We judged that a more formal (and more costly) content analytical approach was not needed, because the amount of content that was analyzed was very small (i.e., 14 communications, none of which were more than a few sentences in length).

Of note, we acknowledge that due to our very limited resources,[2] we could not carry out more rigorous content analyses, ones that would have been able to generate metrics to help assess the construct validity of the coding categories that we identified. Thus, our use of content analysis should be viewed as more exploratory in nature, as opposed to being well-suited to formal hypothesis testing.

After the two content analysis studies were completed, the persuasive themes that the SRC used for recruitment and maintenance of the Life in Australia™ panel were then compared with the categories for the self-reports of why panelists joined and remained active in the panel.
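Mechanically, this comparison amounts to a set difference between the two coded theme lists. A minimal sketch, using hypothetical short labels rather than our full category wordings:

# Hypothetical short labels for coded panelist motivations (Table 1)
# and themes found in the SRC communications (Table 2, left column).
panelist_motivations = {
    "voice_heard", "influence_change", "represent_others",
    "thought_provoking", "interesting_topics", "part_of_a_team",
}
communication_themes = {
    "voice_heard", "influence_change", "represent_others",
    "voluntary_participation",
}

# Motivations panelists reported that the communications never invoke:
# candidates for new persuasive appeals.
print(sorted(panelist_motivations - communication_themes))

# Themes the communications use that no panelist echoed (in our study,
# only the voluntary-participation statement fell in this group).
print(sorted(communication_themes - panelist_motivations))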

Results

Before we present the results from our analytics, we would like to remind the reader that the specific findings that we report may not generalize to other research panels. This, in itself, should not be viewed as a limitation of our research, because the primary purpose of presenting our case study is to show how the proposed method is organized and can be implemented. Thus, it is the external validity of the proposed method that we believe is strong, even if all the particular results from the Life in Australia™ panel data, as to which motivations were identified, do not generalize exactly to other panels. We reason that what is found in one panel about what motivates people to join and stay active in it may not be the same as what motivates people to join and stay active in another panel, elsewhere in the world, possibly being used for other purposes than to characterize the attitudes and beliefs of a nation’s general population. Yet, we do not view this as a limitation of our method, as the method can be expected to generate findings that can be tailored to the particular panel to which the method is applied.

Quantitatively coded motivations reported by panelists in the open-ended data

Table 1 presents the proportion of panelists who mentioned each of the 17 motivation categories into which the open-ended data were coded; these are ordered by their frequency of being mentioned.

As shown in Table 1, four-fifths of the panelists mentioned at least one motivation that referred to their being pleased to be able to share their views and that they believed that someone was “listening” to them; i.e., motivational Codes 1-3. About one-fifth of the panelists expressed at least one motivation that referred to their being intellectually engaged whilst participating in the panel; i.e., motivational Codes 4, 9, and 11. A similar proportion of the panelists mentioned at least one motivation that referred to being gifted/valued for their input; i.e., motivational Codes 5 and 7. About one-eighth of the panelists mentioned at least one motivation about wanting to give/contribute to something they were part of (Codes 8, 10, and 13), and approximately one-twelfth mentioned doing something interesting with their time (Code 6). Other motivations, such as enjoying being part of a panel managed by a respected organization (Code 15) or sharing views/opinions via a nonjudgmental platform (Code 14), were less prevalent.

Table 1. Motivational Factors (Coded Open-Ended Question Answers) for Joining and/or Staying Active in Panel and Demographic Correlates.
No. | Self-reported motivation from coded verbatims | Pct. of panelists mentioning (n = 1,557) | Significant background correlates (p < .05)
1 | Self-actualization, allows my voice to be heard | 35.8 | NONE
2 | Sharing views/opinions to make a difference or influence change | 32.2 | Males
3 | Sharing views/opinions to represent others like me/population subgroups/minorities | 12.0 | ≥ Bachelors
4 | Thought-provoking to participate in the panel | 11.6 | Females
5 | My input is valuable | 10.3 | Live in Western Australia
6 | Interesting topics | 8.2 | NONE
7 | Receiving incentives | 7.8 | Younger adults
8 | Contributing to the survey/study/research/science | 5.8 | ≥ Bachelors
9 | Enjoying surveys, participating in research | 5.5 | Females
10 | Donating incentives to charity | 5.1 | ≥ Bachelors
11 | To be informed about topical issues | 3.9 | Females
12 | Positive sentiment towards Australia | 2.9 | Older adults
13 | Like being part of something, part of a team | 2.3 | ≥ Bachelors, Live in Western Australia
14 | Sharing views/opinions in a non-judgmental platform | 1.8 | NONE
15 | Social Research Centre/Life in Australia™ are reliable enterprises | 1.3 | ≥ Bachelors
16 | Have the time/something to do | 1.0 | NONE
17 | Other misc. motivation | 5.0 | NONE

A series of logistic regression analyses were conducted to determine panelists’ background characteristics,[3] if any, that were significantly related (p < .05) to whether a panelist mentioned a particular type of motivation. These results are shown in the right column of Table 1. The motivations for which there were no background correlates (shown as “NONE” in Table 1) were likely to be mentioned by all types of panelists, whereas motivations for which reliable predictors were identified were more likely to be mentioned by panelists with the listed characteristics. Those with higher educational attainment were more likely than those with lower educational attainment to mention Motivations 3, 8, 10, 13, and 15. Females were more likely than males to mention Motivations 4, 9, and 11, whereas males were more likely to mention Motivation 2. Age of the panelist was related to Motivations 7 and 12, but in opposite directions.
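As a sketch of one such analysis, assuming a hypothetical data file with one row per panelist, a 0/1 indicator for whether a given motivation was coded in the verbatim, and the four background variables the SRC supplied (gender, age, education, region), one regression per motivation could be fit as follows:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file: one row per panelist, with 0/1 indicators for
# each coded motivation plus gender, age, education, and region.
df = pd.read_csv("panelist_motivations.csv")

# Logistic regression for one motivation; categorical predictors are
# dummy-coded automatically by the formula interface. Repeat for each
# of the 17 motivations.
model = smf.logit(
    "mentioned_motivation_2 ~ C(gender) + age + C(education) + C(region)",
    data=df,
).fit()

# Coefficients with p-values; correlates at p < .05 populate the
# right-hand column of Table 1.
print(model.summary())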

Comparing recruitment/maintenance strategies with self-reported motivations to join and participate in a panel

As noted, we also qualitatively content-analyzed all scripts and the other materials used by the SRC to recruit and maintain the panelists’ participation in the panel. As a result, we were able to compare the persuasive statements that the SRC used to try to build and maintain the panel against the quantitatively coded self-reports of panelists’ participation motivations.

A comparison of the two columns in Table 2 shows that all but one of the motivational reasons used by the SRC (left column) were mentioned by the panelists, in fairly similar wording, in their verbatims (right column). Of note, the one persuasive statement that the SRC used to recruit panelists that was not explicitly mentioned by the panelists themselves was, “Your participation is voluntary and you can drop out at any time.”

The persuasive statements used by the SRC in recruitment that matched up most frequently with the motivations that panelists mentioned in their verbatims (as shown in the right-hand column of Table 2) were:

  • # 2 – Sharing views/opinions to make a difference or influence change;

  • # 5 – My opinions are valued, appreciated, taken into account; and

  • # 8 – Contributing to the survey/study/research/science.

Overall, 42% of the 1,557 panelists mentioned one of these three reasons, and another 3% mentioned two of them. However, the majority of panelists (55%) did not mention any of these three reasons for why they joined and had stayed active in the panel.

In addition, and of note, the panelists expressed six motivations (shown in Table 1, but not shown in Table 2) that were not part of what the SRC had communicated to them during recruitment and/or in follow-up contacts to motivate panelists to complete survey wave questionnaires. These six motivations were:

  • # 13 – Like being part of something, part of a team;

  • # 4 – Thought-provoking to participate in the panel;

  • # 11 – To be informed about topical issues;

  • # 9 – Enjoy surveys, participating in research;

  • # 6 – Interesting topics; and

  • # 12 – Positive sentiments toward Australia.

Table 2. Mapping Reasons in SRC Communications to Panelists’ Self-Reported Motivations (motivation numbers refer to Table 1).
No. | Motivational theme in recruitment/maintenance communications used by the SRC | Panelists’ self-reported motivations to participate in the panel, from verbatims
1 | You can influence Australian researchers, policy-makers, and academics | 2 Sharing views/opinions to make a difference or influence change; 8 Contributing to the survey/study/research/science
2 | You will help others better understand Australia and Australians | 2 Sharing views/opinions to make a difference or influence change; 5 My opinions are valued, appreciated, taken into account; 8 Contributing to the survey/study/research/science
3 | Your views will be heard | 1 Self-actualization, allows my voice to be heard; 5 My opinions are valued, appreciated, taken into account
4 | It’s an opportunity to share your views | 1 Self-actualization, allows my voice to be heard; 2 Sharing views/opinions to make a difference or influence change; 3 Sharing views/opinions to represent others like me/population subgroups/minorities; 14 Sharing views/opinions in a non-judgmental platform
5 | Your views will be represented | 3 Sharing views/opinions to represent others like me/population subgroups/minorities; 5 My opinions are valued, appreciated, taken into account
6 | You will gain incentives/rewards/make donations | 7 Receiving incentives; 10 Donating incentives to charity
7 | Your input will be appreciated | 5 My opinions are valued, appreciated, taken into account; 8 Contributing to the survey/study/research/science
8 | Your input is valuable | 5 My opinions are valued, appreciated, taken into account; 8 Contributing to the survey/study/research/science; 2 Sharing views/opinions to make a difference or influence change
9 | You will be participating in important national research | 8 Contributing to the survey/study/research/science; 2 Sharing views/opinions to make a difference or influence change
10 | You are special - relatively few Australians get this invitation-only chance to join | 5 My opinions are valued, appreciated, taken into account; 3 Sharing views/opinions to represent others like me/population subgroups/minorities
11 | This is a chance to participate in something innovative/novel/unique | 8 Contributing to the survey/study/research/science
12 | Your data are protected by Australian privacy laws and kept confidential | 15 Social Research Centre/Life in Australia™ are reliable enterprises
13 | Your participation is voluntary and you can drop out at any time | NONE
14 | You will be helping Australian National U. | 8 Contributing to the survey/study/research/science; 15 Social Research Centre/Life in Australia™ are reliable enterprises
15 | You will be affiliated with something Australian National U. is doing | 15 Social Research Centre/Life in Australia™ are reliable enterprises

Discussion

Panel representation is often out of balance, and that causes myriad problems. Essentially all general population panels fail to recruit an unweighted membership that is closely representative of the population the panel purports to represent. And it can be surmised that in many, if not most, cases this leads to panel data that suffer from nonignorable noncoverage and nonresponse biases. Furthermore, in most cases, our experience indicates that panels do not provide their clients with reliable evidence that the statistical adjustments they make to try to improve representation actually have achieved that goal/claim.

Seeking low-cost solutions for this problem. A major reason that this imbalance in representing the target population occurs is that improving the unweighted representation of the panel can be very costly. Thus, researchers seek methods to improve representation that are attractive from a cost-benefit perspective.

Our approach is feasible and low-cost. The low-cost method that we have described is one that we believe could raise panel representation and one that most, if not all, panels could undertake. As we have noted, we had very limited resources to carry out our study, yet we believe that we were able to generate many useful outcomes while keeping costs quite low. The latter further demonstrates to other panel companies that one does not require a lot of funding to deploy our method. However, as we readily acknowledge, we have not as yet been able to formally test this method using an unconfounded experimental design. Such a design (which need not be costly) would take findings from a study such as the one we describe and revise/enhance the scripts and other materials that are used by a panel organization to recruit members and to maintain the participation of their panelists within the panel. Ideally, this would be implemented via an experimental design with a randomly chosen group of newly sampled respondents being recruited using the new approach, whereas the other randomly chosen group of newly sampled respondents would be recruited via the panel organization’s standard recruitment approach.

To carry out such an experiment, a panel organization would add an open-ended question or brief series of questions to a wave of data collection, likely at the end of one of their questionnaires. They would use content analysis to identify a meaningful, yet parsimonious, set of categories for the motivations that the panelists have reported about why they joined the panel and why they have stayed active in it. The organization then would investigate whether certain motivations were more likely to resonate with certain types of panelists, especially with panelists in demographic/psychographic cohorts that are underrepresented in the panel. The organization then would assemble all their recruitment and maintenance materials used to gain and maintain cooperation from panelists and content analyze those to identify the motivational themes in the communications they use with panelists. They then would identify the agreement/correspondence, or lack thereof, between the two sets of motivations. The experiment would be implemented when recruiting a new group of panelists. Each potential panelist would be randomly assigned to either the old/standard recruitment/maintenance communications or the new set of communications that are based on what was learned from the content analyses. The main dependent variables in the experiment would be (a) the initial rate at which people joined the panel and (b) their longer-term attrition rate from the panel. The researcher-controlled independent variable would be to which of the recruitment protocols was a potential panelist subjected.
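As a minimal sketch of the randomization and the headline comparison, assuming hypothetical sample sizes and join counts (the actual design and analysis choices would be the panel organization’s), a two-proportion z-test could compare join rates between the two protocols:

import numpy as np
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(2023)

# Hypothetical frame of 4,000 newly sampled people, randomly assigned
# in equal numbers to the standard or the revised communications.
protocol = rng.permutation(np.repeat(["standard", "revised"], 2000))

# Hypothetical join counts observed at the end of recruitment.
joined = {"standard": 310, "revised": 362}

# Two-proportion z-test on initial join rates (dependent variable (a)
# in the text); longer-term attrition could be compared the same way.
stat, pvalue = proportions_ztest(
    count=[joined["revised"], joined["standard"]],
    nobs=[int(np.sum(protocol == "revised")),
          int(np.sum(protocol == "standard"))],
)
print(f"z = {stat:.2f}, p = {pvalue:.3f}")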

We demonstrated that there are a manageable set of motivations to focus upon in creating communications to use with potential panelists. The approach that we have described identified a manageable set of self-reported reasons that panelists joined and stayed active in the Life in Australia™ panel. It is often the case that the same population of panelists will continue to be sampled by, and recruited for, the panel. It is reasonable to expect that the motivations that are identified in a study such as ours will apply during future recruitment efforts by a panel. However, it would be prudent for a panel organization to repeat the study within a few years to learn if any important new motivations have arisen to lead people to join their panel or if any previously identified motivations are no longer relevant.

Some self-reported motivations were not being used by the SRC. In our study, there were six motivations that panelists reported that were not included in the recruitment and maintenance communications that the SRC used with potential panelists and currently inactive panelists. Although we were not able to test the use of these six motivations, it is reasonable to expect them to resonate with at least some of the people who are contacted to join the panel in the future. We have advised the SRC to test our recommended modifications to their recruitment and maintenance motivational communications.

Some motivations can be expected to apply to all panelists, whereas others are likely to apply more to certain types of panelists. Since panels have myriad background variables on their panelists, these analyses are feasible to conduct. We found that panelists with certain background characteristics were more likely to report certain motivations for joining and participating in the panel. Thus, one could envision a process whereby panel operators could tailor/target specific appeals to certain types of potential and current panelists during recruitment and maintenance efforts. This tailored process would be highly consistent with leverage-saliency theory (Groves, Singer, and Corning 2000) and with Dillman et al.'s (2014) tailored design method.

Furthermore, as shown in Table 1, some of the expressed motivations were found to correlate with demographic cohorts that are often underrepresented in the final samples of many surveys. Those cohorts include males and young adults. Had we been able to receive a much more robust set of demographic data from the SRC, we expect we would have identified motivations that are associated with other demographic cohorts that are traditionally under-represented in panels. By identifying groups of panelists (e.g., males) who report particular reasons to have joined the panel (e.g., “sharing views and opinions that make a difference or influence change”), survey organizations could target these reasons for joining a panel as part of their recruitment of members of traditionally under-represented demographic groups.

This approach is not a panacea but adds to the “toolbox” of panel organizations. Although our approach has not been tested experimentally, we believe that it will not harm recruitment and maintenance efforts, and likely will help at least to some extent. We encourage panel organizations to implement the approach that we described and openly disclose what they find. Only in this way will the actual value of our approach become known to survey practitioners.

There are limitations to the research that we have reported. First, we were unable to gather open-ended data from nonrespondents and attritors about the reasons they were unwilling to join the panel or remain active in the panel. Knowing such information would likely be very useful in deciding upon the adequacy of the communications a panel organization uses to gain/maintain participation. To this end, there are ways to gather such information from those sampled for panels but who do not join or do not stay active once they join. For example, brief exit interviews with attritors and asking a few (1-3) quick questions of nonrespondents at the time of their refusal—similar to the use of a refusal report form as recommended by Lavrakas (1993)—about why they are unwilling to join or remain active in the panel should provide very useful data. But even without such data from those who did not originally join the panel and those who did not stay active in the panel, we believe that the data we gathered in our study still have value.

Second, we have reported the findings from one study of one panel and with quite limited resources available to support our study. Until other studies by other panel organizations are carried out, it will be unclear how the value of our approach generalizes to other panels. Related to this, and as we previously noted, due to the budget/resource constraints that we faced, we carried out somewhat anemic content analyses that may not have yielded findings with strong construct validity. That is, the sets of categories that we identified in our content analyses may suffer from measurement error to some extent. But, this notwithstanding, we believe that we have demonstrated that, even under such funding constraints, valuable and actionable results can be achieved.

Finally, in understanding the value of the open-ended data that we gathered from panelists, it is important to recall that the 1,557 panelists had been exposed to the persuasive communications used by the Life in Australia™ panel prior to being asked to provide their open-ended verbatims of motivational influences, albeit in many cases this occurred 4-5 years earlier. Thus, a possibility is that when asked what motivated them to be part of the panel, the panelists’ answers were influenced by what they may have accurately recalled hearing or reading from the SRC about why they should join and stay active in the panel.

The need for future research. As we have noted, data such as those we gathered are needed from nonrespondents and attritors as to why they fail to join panels or stay active in panels. This includes data from qualitative research such as in-depth interviews (cf. Kocar and Lavrakas 2023) and focus groups. We also believe that panel organizations will benefit if they start to gather data whenever they encounter a refusal when recruiting for their panel, for example, by having interviewers ask a few questions of the nonrespondents about their reasons for not joining. Granted, the nonresponse to those nonresponse follow-up questions will be considerable, but we are sanguine about the prospect that enough useful and valuable data will be provided by the nonrespondents to justify the effort.


Lead Author Contact Information

Paul J. Lavrakas, Ph.D.
2542 Princeton Avenue
Evanston IL 60201 USA
224-307-2654

ACKNOWLEDGEMENTS

We appreciate the generosity of, and help provided by, the Social Research Centre (SRC) of Australian National University for their willingness to gather and share with us the data used in this study and provide us with copies of the panel recruitment and panel maintenance communications used for their Life in Australia™ panel. We especially appreciated the help provided by Ben Phillips and Dina Neiger of the SRC. We also appreciate the feedback and helpful suggestions made by the Survey Practice reviewers and editors.


  1. We were not able to gather such additional data as that was beyond the scope of the assistance that the SRC was able to provide to our project.

  2. No one funded us to do the coding, the analyses, or to write the manuscript.

  3. The SRC was only able to provide us panelist background characteristics for gender, age, education, and region of residence.

Submitted: December 13, 2022 EDT

Accepted: August 09, 2023 EDT

References

Aday, M.L. (aka Meatloaf), and J. Steinman. 1977. “Two Out of Three Ain’t Bad.” Bat Out of Hell. Epic Records.
Bosnjak, Michael, Tracy L. Tuten, and Werner W. Wittmann. 2005. “Unit (Non)Response in Web-Based Access Panel Surveys: An Extended Planned-Behavior Approach.” Psychology and Marketing 22 (6): 489–505. https://doi.org/10.1002/mar.20070.
Brüggen, Elisabeth, Martin Wetzels, Ko De Ruyter, and Niels Schillewaert. 2011. “Individual Differences in Motivation to Participate in Online Panels: The Effect on Response Rate and Response Quality Perceptions.” International Journal of Market Research 53 (3): 369–90. https://doi.org/10.2501/ijmr-53-3-369-390.
Coe, Kevin, and Joshua M. Scacco. 2017. “Content Analysis, Quantitative.” In The International Encyclopedia of Communication Research Methods, 1–11. Hoboken, NJ: Wiley-Blackwell. https://doi.org/10.1002/9781118901731.iecrm0045.
Dillman, D. 2020. “Towards Survey Response Rate Theories That No Longer Pass Each Other like Strangers in the Night.” In Understanding Survey Methodology, 15–44. Cham: Springer. https://doi.org/10.1007/978-3-030-47256-6_2.
Dillman, D., J. Smyth, and L. Christian. 2014. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 4th ed. Hoboken, NJ: John Wiley.
Einarsson, H., A. Cernat, and N. Shlomo. 2021. “Reducing Respondent Burden with Efficient Survey Invitation Design.” Survey Research Methods 15 (3): 207–33.
Greenberg, Pierce, and Don Dillman. 2021. “Mail Communications and Survey Response: A Test of Social Exchange Versus Pre-Suasion Theory for Improving Response Rates and Data Quality.” Journal of Survey Statistics and Methodology 11 (1): 1–22. https://doi.org/10.1093/jssam/smab020.
Groves, Robert M., Eleanor Singer, and Amy Corning. 2000. “Leverage-Saliency Theory of Survey Participation.” Public Opinion Quarterly 64 (3): 299–308. https://doi.org/10.1086/317990.
Han, Vallen, Gerald Albaum, James B. Wiley, and Peter Thirkell. 2009. “Applying Theory to Structure Respondents’ Stated Motivations for Participating in Web Surveys.” Qualitative Market Research: An International Journal 12 (4): 428–42. https://doi.org/10.1108/13522750910993338.
Kalleitner, Fabian, Monika Mühlböck, and Bernhard Kittel. 2022. “What’s the Benefit of a Video? The Effect of Nonmaterial Incentives on Response Rate and Bias in Web Surveys.” Social Science Computer Review 40 (3): 700–716. https://doi.org/10.1177/0894439320918318.
Kocar, Sebastian, and Paul J. Lavrakas. 2023. “Social-Psychological Aspects of Probability-Based Online Panel Participation.” International Journal of Public Opinion Research 35 (2): 1–8. https://doi.org/10.1093/ijpor/edad012.
Lavrakas, Paul J. 1993. Telephone Survey Methods: Sampling, Selection, and Supervision. 2nd ed. Thousand Oaks, CA: Sage. https://doi.org/10.4135/9781412986083.
Martinsson, J., D. Dumitrescu, and K. Riedel. 2017. “Recruiting an Online Panel from Another Online Survey: Consequences of Framing and Placement of the Recruitment Question.” International Journal of Public Opinion Research 29 (2): 339–51.
Moore, Raeal, and Lebena Varghese. 2019. “An Examination of the Role of Funny Reminders and Embedded Survey Items on Response Rate.” Social Science Computer Review 39 (1): 162–72. https://doi.org/10.1177/0894439319852893.
Roller, M., and P.J. Lavrakas. 2015. Applied Qualitative Research Design: A Total Quality Framework Approach. New York: The Guilford Press.
