Survey Practice
Vol. 16, Issue 1, 2023 · August 10, 2023 EDT

Give them a call! About the importance of call-back strategies in panel surveys

Blazej Palat, Marion Elie, Selma Bendjaballah, Guillaume Garcia, and Nicolas Sauger
Keywords: online panel, survey, non-response, wave-specific participation, panel management
https://doi.org/10.29115/SP-2023-0009
Palat, Blazej, Marion Elie, Selma Bendjaballah, Guillaume Garcia, and Nicolas Sauger. 2023. "Give Them a Call! About the Importance of Call-Back Strategies in Panel Surveys." Survey Practice 16 (1). https://doi.org/10.29115/SP-2023-0009.

Abstract

Building on an experiment embedded in the French probability-based web panel Longitudinal Internet Studies for Social Sciences (ELIPSS), this paper estimates the effect of calling back nonrespondents to specific waves in order to increase overall participation. Comparing groups defined by their previous pattern of participation and by a treatment determining whether a telephone callback was made in case of nonresponse, we test whether the callback effect is conditional on previous participation patterns. The panellists' probability of response decreased in proportion to the number of studies missed, while the motivating effect of telephone callbacks appeared independent of this factor. This paper therefore lends credence to the assumption that the effectiveness of callback strategies is quite stable irrespective of the panellists' level of commitment.

1. Introduction

While probability-based panels dedicated to social research now operate in many countries (Blom et al. 2016), the daily management of such panels leaves open many questions about how best to allocate the resources dedicated to these infrastructures in order to offer an optimal quality of service. In the survey tradition, service quality is largely defined by the internal and external validity of panel data, which depends on several factors, including the quality of the sample. Much has been said about the impact of initial recruitment in probability-based panels, and attrition has also received a lot of attention (Clinton 2001; Lynn et al. 2005; Das, Toepoel, and van Soest 2011; Genoni et al. 2021). Fewer insights have been offered into what happens in between, i.e., the participation of panellists in each specific wave.

Panellists' propensity to respond to a specific study in a panel varies across individuals (Behr, Bellgardt, and Rendtel 2005; Fitzgerald, Gottschalk, and Moffitt 1998; Voorpostel and Lipps 2011; Lugtig, Das, and Scherpenzeel 2014; Voorpostel 2010). For instance, there are groups of almost perfectly reliable and reactive panellists who respond systematically; there are also those who are likely to drop out early, as well as various patterns of more or less erratic behavior (Lugtig 2014). Most of the time, nonparticipation and, eventually, attrition result from panellists' conscious decision-making. Among the psychosociological factors involved in this process, commitment to the panel study is crucial. For instance, the fact that participation in the previous wave of a panel is the best predictor of participation in the next wave reflects the effect of commitment. Various mental processes may underlie it, such as self-identification, social pressure, or habit. Whatever these mechanisms are, commitment matters particularly for online panels, which are found to be more subject to attrition than offline panels. Online panels usually solicit their participants more often (Lugtig, Das, and Scherpenzeel 2014), and the absence of interviewers makes the relationship between the panel and the panellist impersonal. Thus, a systematic telephone callback of nonrespondents by panel management staff seems vital for maintaining satisfactory response rates in online panels (Blom, Gathmann, and Krieger 2015; Roscoe, Lang, and Sheth 1975).

Because such an approach is time- and labor-intensive, insights about best practices are particularly valuable. Many decisions involved in setting up an efficient telephone callback protocol are constrained by the availability of financial resources (and therefore staff) and by the panel management agenda. The experimental protocol presented in this paper had two aims: testing the effectiveness of a telephone callback protocol and examining which profiles of panellists are the most receptive to telephone callbacks in a noncommercial online survey panel. In doing so, we focused especially on measures reflecting the panellists' commitment.

However, as the existing literature shows, nonparticipation has many sources. Poor health (Goldberg et al. 2001), a low level of education (Alderman et al. 2001), poor economic status (Burkam and Lee 1998; McCurdy, Mroz, and Gritz 1998; Russell 2002), and weak cognitive capacities (Botwinick and Siegler 1980) appear to be correlated with the probability of attrition. Gender differences also seem to be related to this phenomenon, insofar as females tend to be less prone to attrition than males, although this finding remains controversial (Behr, Bellgardt, and Rendtel 2005; Lepkowski and Cooper 2002; Uhrig 2008). Finally, extreme age categories, especially the eldest, appear more prone to attrition (Genoni et al. 2021; Hayslip, McCoy-Roberts, and Pavur 1999; Voorpostel and Lipps 2011). Conversely, Goodman and Blum (1996) found that married, elderly, and educated Caucasians were particularly likely to stay in a panel in the long run. Therefore, in the experimental setup described in this paper, we controlled either for these exact characteristics or at least for their proxies.

2. Methods

This study is based on the French panel Longitudinal Internet Studies for Social Sciences (ELIPSS). Based on random sampling from census or fiscal data, it is representative of the adult population (over 18) living in France. It was established in 2012 and then refreshed twice (in 2016 and 2020). The initial recruitment in 2012 (27% of sampled individuals) and the 2020 refreshment (14%) each used a slightly different mix of recruitment strategies (postal mail, email, telephone, website, face-to-face, and occasional use of 10€ voucher incentives), whereas the 2016 recruitment (32%) was conducted solely face-to-face with a telephone follow-up.

ELIPSS is dedicated exclusively to social science research, and "interest in research" is one of the motives for joining the panel most often mentioned by new panellists in the introductory survey. Moreover, the panel's image as a research facility benefits from the reputation of its parent institutions, Sciences Po and the French National Centre for Scientific Research. Panellists are solicited once per month to respond to a study that takes about 20 minutes on average to complete. The topics covered are highly varied (politics, arts, leisure activities, social issues, etc.). In addition, a specific survey refreshes all the profile information on the panel respondents every year. Open-ended items are included in each questionnaire to let respondents give feedback about their impressions and difficulties.

Appropriate care is taken to inform panellists of their rights under the General Data Protection Regulation and to provide assistance if needed. For that purpose, they can contact panel management staff by email or by telephone (normal call charges, available five days a week). Each year, at back-to-school time and at the end of the year, greetings are sent to panellists by postal mail together with a small (10€) voucher. Attrition of the original panel had reached 21.4% at the time of the 2016 refreshment; attrition among the panellists recruited in 2016 reached 60.6% by 2020.

Until 2019, every ELIPSS panellist was equipped with a tablet with a dedicated application installed and a 4G internet connection to respond to the studies administered to the panel. This measure was then discontinued. Since then, ELIPSS panellists have responded to surveys using their own devices. If necessary, panel managers provided telephone assistance for opening contact email accounts and/or finding internet access near the panellist's home (for example, in a library). Panellists can access the questionnaires by following links embedded in email invitations. Those who have not responded are systematically reminded of the ongoing study, and of their commitment to the panel, by email and by telephone. More specifically, three panellist statuses are defined at the beginning of each study's fieldwork:

  • Active – responded to the preceding study.

  • Dozing off – nonrespondent to the preceding study only.

  • Invisible – nonrespondent to the two preceding studies in a row.

All nonrespondents to a survey are reminded by email twice. In addition, panellists in the dozing off and invisible categories are also called back by telephone, the former during the penultimate week and the latter during the second week of fieldwork. Panellists are contacted once on each of the telephone numbers available to panel managers. Nonrespondents to more than two preceding studies in a row are moved into the "super invisible" category. They no longer receive telephone reminders but continue to receive email invitations and reminders. In addition, they are notified by postal mail of their initial commitment to the panel and invited to respond to upcoming survey invitations. If they remain nonrespondent to the following waves, they are thanked for their participation and removed from the panel. Figure 1 depicts the lifecycle of an ELIPSS survey.
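To make the status rule concrete, the sketch below restates it as a function of the number of consecutive studies a panellist has missed. It is our own illustration in Python; the function and variable names are ours and are not part of the ELIPSS tooling.

```python
# Illustrative sketch of the ELIPSS status rule described above.
# `history` lists whether the panellist responded to each past study, oldest first.
def assign_status(history: list) -> str:
    missed_in_a_row = 0
    for responded in reversed(history):
        if responded:
            break
        missed_in_a_row += 1
    if missed_in_a_row == 0:
        return "active"           # responded to the preceding study
    if missed_in_a_row == 1:
        return "dozing off"       # nonrespondent only to the preceding study
    if missed_in_a_row == 2:
        return "invisible"        # nonrespondent to the two preceding studies
    return "super invisible"      # more than two studies missed in a row


assert assign_status([True, True]) == "active"
assert assign_status([True, False]) == "dozing off"
assert assign_status([True, False, False]) == "invisible"
assert assign_status([False, False, False]) == "super invisible"
```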

Figure 1. Schematic representation of ELIPSS survey fieldwork.

In 2021, an experimental protocol was set up to test the efficacy of telephone callbacks in reducing nonresponse. As the beginning of the experiment coincided with a panel refreshment, panellists were first divided into two categories: those who had been in the panel before the last recruitment and those who had been recruited recently. Then, within each category, panellists were randomly assigned to one of two experimental treatments, forming four groups in total (see Table 1).

Table 1. Distribution of the panellists with regard to their period of recruitment and experimental condition.

                                     Condition 1 (email only)   Condition 2 (email + telephone)
Panellists recruited in 2020                    340                           352
Panellists recruited before 2020                610                           617
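The randomization mechanism is not described in more detail than above, so the following is only an illustrative sketch of a stratified (within-cohort) random assignment; names and the seed are ours, and since the cell sizes in Table 1 are not exactly equal, the actual procedure may have differed slightly.

```python
import random

# Illustrative sketch (not the ELIPSS implementation) of random assignment to the
# two experimental conditions, performed separately within each recruitment cohort.
def assign_conditions(panellists, seed=2021):
    """panellists: list of dicts, each with a boolean 'recruited_in_2020' key."""
    rng = random.Random(seed)
    for recruited_in_2020 in (True, False):
        cohort = [p for p in panellists if p["recruited_in_2020"] == recruited_in_2020]
        rng.shuffle(cohort)
        half = len(cohort) // 2
        for i, panellist in enumerate(cohort):
            panellist["condition"] = "email only" if i < half else "email + telephone"
    return panellists
```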

The treatment in Condition 2 (email + telephone) corresponded to the normal callback procedure, whereas the treatment in Condition 1 (email only) was somewhat heterogeneous: panellists who had joined ELIPSS before 2020 had their telephone reminders suspended (at least some of them could already have received a telephone callback before the experimental protocol was set up), while those recruited in 2020 had never been called back by telephone to be reminded of an ELIPSS study.

Eight studies were conducted during the implementation of the experimental protocol. At the beginning of the first study, all panellists were considered to be participating systematically in the studies proposed to them; the panel had been purged of formerly inactive panellists at that time, when the new panellists were integrated. Hence, the first panellists in the dozing off category appeared at the beginning of the second study, and the full range of statuses was available only at the beginning of the third. Therefore, the analysis in the following sections pertains to response rates collected during only six of the eight studies. The first two studies can also be considered a dishabituation period with regard to telephone callbacks for Condition 1 panellists recruited before 2020.

3. Analyses

3.1. Telephone callbacks

A significantly greater proportion of the invisible than of the dozing off were actually called back (χ²(1) = 33.8, p < 0.0001; see Table 2 for details). This could simply reflect the fact that the dozing off were called back later in the study fieldwork than the invisible, so they had more time to respond before the callback. A slightly greater proportion of the dozing off than of the invisible responded to surveys following those callbacks, but this difference did not reach statistical significance (see Table 3 for details).

Table 2. Panellists called back, by status.

Status          Callback absent   Callback present   Row total
Invisible              25               170              195
  Row %               12.8              87.2               -
  Column %            11.3              31.6               -
  Total %              3.3              22.4             25.7
Dozing off            196               368              564
  Row %               34.8              65.2               -
  Column %            88.7              68.4               -
  Total %             25.8              48.5             74.3
Column total          221               538              759
  %                   29.1              70.9               -
Table 3. Respondents and nonrespondents among called-back panellists, by status.

Status          Non-respondent   Respondent   Row total
Invisible              98             72          170
  Row %               57.6           42.4           -
  Column %            35.1           27.8           -
  Total %             18.2           13.4         31.6
Dozing off            181            187          368
  Row %               49.2           50.8           -
  Column %            64.9           72.2           -
  Total %             33.6           34.8         68.4
Column total          279            259          538
  %                   51.9           48.1           -
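The chi-square statistic quoted above (χ²(1) = 33.8) can be recomputed directly from the four counts in Table 2. A minimal sketch using scipy, with Yates' continuity correction switched off, which is what reproduces the reported value:

```python
from scipy.stats import chi2_contingency

# Counts from Table 2: rows = status, columns = (callback absent, callback present).
table2 = [[25, 170],    # invisible
          [196, 368]]   # dozing off
chi2, p, dof, expected = chi2_contingency(table2, correction=False)
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.2e}")  # chi2(1) = 33.8, p < 0.0001
```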

Given the difference in responses obtained from the dozing off and the invisible between Conditions 1 and 2 (137 additional responses), and given that the 853 callbacks issued cost a total of 855€ (57 hours of work paid at 15€ per hour), the estimated cost of one supplementary response was about 6.2€ (855€ / 137).
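The cost estimate is simple arithmetic on the figures just given:

```python
# Cost of one supplementary response, from the quantities reported in the text.
hours_of_work = 57
hourly_rate_eur = 15
extra_responses = 137                                  # responses attributed to callbacks
total_cost_eur = hours_of_work * hourly_rate_eur       # 855 euros
print(round(total_cost_eur / extra_responses, 1))      # 6.2 euros per supplementary response
```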

3.2. Response behavior

Using the protocol described above, we observed the participation behavior of 1,919 panellists across six surveys, i.e., 10,978 decisions whether to respond to an ELIPSS survey (excluding cases where panellists had been moved into the super invisible category). Besides our treatment, we took into consideration six other independent variables to explain response behavior: the previous response pattern (status), the period of recruitment into the panel (recruitment wave), age (in seven groups), education (in four groups), gender, and area of residence (in five groups).

We fitted a logistic regression with robust standard errors to adjust for within-subject repeated measurements of response. The results (see Table 4 for details) confirm a strong and statistically significant main effect of the panellists' status (response rates of 89.9%, 54.3%, and 35.1% for the active, dozing off, and invisible, respectively) as well as a weaker main effect of the experimental treatment (response rates of 85.3% and 82.9% for Conditions 2 and 1, respectively) on the dependent variable.
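A specification of this kind could look as follows in statsmodels; this is a minimal sketch with assumed column names (response, status, condition, recruitment_wave, age_group, education, gender, urbanization, panellist_id), not the authors' exact model code. Standard errors are clustered on the panellist to account for repeated response decisions.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_response_model(df: pd.DataFrame):
    """df: one row per panellist x study (long format) with the columns named below."""
    model = smf.logit(
        "response ~ C(status) * C(condition) * C(recruitment_wave)"
        " + C(age_group) + C(education) + C(gender) + C(urbanization)",
        data=df,
    )
    # Cluster-robust standard errors adjust for repeated measurements
    # of the same panellist across the six studies.
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["panellist_id"]})
```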

Table 4. Estimates of the logistic regression model (Type III effects).

Effect                                               LR Chi²    df    p value
Status                                              1082.798     2    < 0.0001 ***
Experimental Condition                                36.424     1    < 0.0001 ***
Recruitment Wave                                       6.846     1      0.0089 **
Age Group                                            168.316     1    < 0.0001 ***
Level of Education                                    11.689     1      0.0006 ***
Gender                                                 6.557     1      0.0104 *
Area-of-Residence Degree of Urbanization               0.743     1      0.3887
Status x Experimental Condition                       39.739     2    < 0.0001 ***
Status x Recruitment Wave                              2.046     2      0.3595
Experimental Condition x Recruitment Wave              2.781     1      0.0954
Status x Experimental Condition x Recruitment Wave     0.423     2      0.8094

As the interaction between these two variables was also significant, we focused on that interaction effect rather than on the main effects of the two variables. For that purpose, we conducted post-hoc pairwise Tukey comparisons of response propensity between the groups formed by the interaction (see Figure 2, and Table 5 in the Appendix for details). We found that the active panellists had the strongest response propensity irrespective of the experimental treatment (response rate of 89.6% in Condition 2 vs. 90.2% in Condition 1), which actually made a difference only for the dozing off (62.9% in Condition 2 vs. 45% in Condition 1) and the invisible (44.1% vs. 28.1%).

Figure 2. Response rates as a function of the panellists' status and the experimental treatment. Vertical bars indicate 95% confidence intervals.
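Table 5 (in the Appendix) reports Tukey-adjusted contrasts from the fitted model. As a rough, model-free substitute, the status-by-condition cells shown in Figure 2 could be compared with pairwise two-proportion z-tests and a Holm correction. This is our own approximation under assumed inputs (per-cell response and invitation counts), not the procedure used to produce Table 5.

```python
from itertools import combinations
from statsmodels.stats.proportion import proportions_ztest
from statsmodels.stats.multitest import multipletests

def pairwise_response_comparisons(cells):
    """cells: dict mapping a group label to (n_responses, n_invitations)."""
    pairs, pvals = [], []
    for (a, (ra, na)), (b, (rb, nb)) in combinations(cells.items(), 2):
        _, p = proportions_ztest([ra, rb], [na, nb])   # two-sample proportion z-test
        pairs.append((a, b))
        pvals.append(p)
    reject, p_adj, _, _ = multipletests(pvals, method="holm")
    return [(pair, round(p, 4), sig) for pair, p, sig in zip(pairs, p_adj, reject)]

# Usage (hypothetical counts): pairwise_response_comparisons(
#     {"active / email only": (900, 1000), "dozing off / email only": (450, 1000)})
```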

In addition to the main effects of interest, several control variables also had significant effects. Response propensity increased with age (odds ratio [OR] = 1.31, 95% confidence interval [CI] [1.25, 1.37]) and decreased with level of education (OR = 0.9, 95% CI [0.84, 0.97]). Females were found to be more responsive to study invitations than males (84.8% vs. 83.4% response rate). The response rate of panellists recruited before 2020 was higher (85.6%) than that of those recruited shortly before the experiment (81.4%).

4. Discussion and conclusion

We analyzed the impact of telephone callbacks on ELIPSS panellists' response propensity within a limited period of panel activity, while controlling for a set of behavioral and sociodemographic characteristics. The results of the experiment confirmed the effectiveness of a telephone callback protocol, as opposed to mere email reminders of a pending survey, in boosting response rates and therefore limiting wave-specific nonparticipation. Having responded to the preceding study was the best predictor of responding to the current one, irrespective of whether the panellist might have been called back by telephone for one of the previous studies. Once this rhythm was broken and a study went unanswered, the probability of further nonresponse increased, and telephone callbacks became useful. However, even though the probability of response was lower for an invisible panellist (who had missed the two preceding studies) than for a dozing off one (only one study missed), the effectiveness of telephone callbacks was similar for both statuses. Furthermore, we could reproduce some effects known from the literature on survey and panel nonresponse, namely that panellists' age and level of education are correlated with their response rates and that females are slightly more committed than males.

From a practical standpoint, further retention strategies involving personal contact can be conceived. For example, one could consider how to get inactive panellists (here, the super invisible category) to renew their commitment. One can imagine a strategy consisting of contacting them after a longer delay (for panels similar to ELIPSS, after more than four missed studies, for example) in order to persuade them of the importance of responding to surveys regularly. Such an approach should be designed on the basis of proven persuasion techniques: the benefits of participating should be emphasized and the inconveniences downplayed, social norms in favor of participating should be made salient, and perceived control over one's response behavior should be reinforced (Ajzen 1991). Once the panellist has reconfirmed the intention to participate, appropriate implementation intention techniques could also be envisaged (Gollwitzer and Sheeran 2006).

Our results also highlight the importance of panel refreshments aimed at replenishing the profiles most prone to attrition. In fact, those profiles seem very hard to retain in the long run, even with approaches as costly and time-consuming as telephone callbacks, unless those approaches are readjusted as suggested above. On the other hand, panel managers should also take care not to produce Hawthorne effects (Dickson and Roethlisberger 2003) that could affect the response behavior of people targeted by exceptional retention procedures.

Despite their undoubted interest, the results of our study should be interpreted with caution because of the short period covered by the experimental treatments. Within the current framework, we cannot make predictions about possible consequences of wave-specific nonresponse, such as attrition. For the same reason, the panellist typology used in the analyses is also quite limited compared to some in-depth studies of panellists' profiles (Lugtig 2014). Future research on similar topics should cover longer periods of data collection to overcome these limitations.


Acknowledgment

The research presented in this paper benefited from a grant (no. 20DATA-002-0/20DATA-002-1) of the French National Public Health Agency (Santé Publique France).

Submitted: March 29, 2023 EDT

Accepted: July 06, 2023 EDT

References

Ajzen, Icek. 1991. "The Theory of Planned Behavior." Organizational Behavior and Human Decision Processes 50 (2): 179–211. https://doi.org/10.1016/0749-5978(91)90020-t.
Alderman, Harold, Jere Behrman, Hans-Peter Kohler, John A. Maluccio, and Susan Watkins. 2001. "Attrition in Longitudinal Household Survey Data: Some Tests for Three Developing-Country Samples." Demographic Research 5 (4): 79–124. https://doi.org/10.4054/demres.2001.5.4.
Behr, Andreas, Egon Bellgardt, and Ulrich Rendtel. 2005. "Extent and Determinants of Panel Attrition in the European Community Household Panel." European Sociological Review 21 (5): 489–512. https://doi.org/10.1093/esr/jci037.
Blom, Annelies G., Michael Bosnjak, Anne Cornilleau, Anne-Sophie Cousteaux, Marcel Das, Salima Douhou, and Ulrich Krieger. 2016. "A Comparison of Four Probability-Based Online and Mixed-Mode Panels in Europe." Social Science Computer Review 34 (1): 8–25. https://doi.org/10.1177/0894439315574825.
Blom, Annelies G., Christina Gathmann, and Ulrich Krieger. 2015. "Setting Up an Online Panel Representative of the General Population: The German Internet Panel." Field Methods 27 (4): 391–408. https://doi.org/10.1177/1525822x15574494.
Botwinick, Jack, and Ilene C. Siegler. 1980. "Intellectual Ability among the Elderly: Simultaneous Cross-Sectional and Longitudinal Comparisons." Developmental Psychology 16 (1): 49–53. https://doi.org/10.1037/0012-1649.16.1.49.
Burkam, David T., and Valerie E. Lee. 1998. "Effects of Monotone and Nonmonotone Attrition on Parameter Estimates in Regression Models with Educational Data: Demographic Effects on Achievement, Aspirations, and Attitudes." The Journal of Human Resources 33 (2): 555–74. https://doi.org/10.2307/146441.
Clinton, Joshua. 2001. "Panel Bias from Attrition and Conditioning: A Case Study of the Knowledge Networks Panel." In AAPOR 55th Annual Conference. Montreal. https://cdn.vanderbilt.edu/vu-my/wp-content/uploads/sites/349/2011/10/14093738/C_WP2001.pdf.
Das, Marcel, Vera Toepoel, and Arthur van Soest. 2011. "Nonparametric Tests of Panel Conditioning and Attrition Bias in Panel Surveys." Sociological Methods & Research 40 (1): 32–56. https://doi.org/10.1177/0049124110390765.
Dickson, William J., and F. J. Roethlisberger. 2003. Management and the Worker. 1st ed. London: Routledge. https://doi.org/10.4324/9780203503010.
Fitzgerald, John, Peter Gottschalk, and Robert Moffitt. 1998. "An Analysis of Sample Attrition in Panel Data: The Michigan Panel Study of Income Dynamics." Journal of Human Resources 33 (2): 251–99. https://doi.org/10.2307/146433.
Genoni, Andreas, Jean-Philippe Décieux, Andreas Ette, and Nils Witte. 2021. "Setting up Probability-Based Online Panels of Migrants with a Push-to-Web Approach: Lessons Learned from the German Emigration and Remigration Panel Study (GERPS)." In The Global Lives of German Migrants: Consequences of International Migration Across the Life Course, edited by Marcel Erlinghagen, Andreas Ette, Norbert F. Schneider, and Nils Witte, 289–307. Cham: Springer International Publishing. https://doi.org/10.1007/978-3-030-67498-4_16.
Goldberg, P., A. Guéguen, A. Schmaus, J.P. Nakache, and M. Goldberg. 2001. "Longitudinal Study of Associations between Perceived Health Status and Self Reported Diseases in the French Gazel Cohort." Journal of Epidemiology & Community Health 55 (4): 233–38. https://doi.org/10.1136/jech.55.4.233.
Gollwitzer, Peter M., and Paschal Sheeran. 2006. "Implementation Intentions and Goal Achievement: A Meta-Analysis of Effects and Processes." Advances in Experimental Social Psychology 38:69–119. https://doi.org/10.1016/s0065-2601(06)38002-1.
Goodman, Jodi S., and Terry C. Blum. 1996. "Assessing the Non-Random Sampling Effects of Subject Attrition in Longitudinal Research." Journal of Management 22 (4): 627–52. https://doi.org/10.1177/014920639602200405.
Hayslip, Bert, Jr., Laura McCoy-Roberts, and Robert J. Pavur. 1999. "Selective Attrition Effects in Bereavement Research: A Three-Year Longitudinal Analysis." OMEGA - Journal of Death and Dying 38 (1): 21–35. https://doi.org/10.2190/68wt-eldf-fwgt-ytm7.
Lepkowski, James, and Mick Cooper. 2002. "Nonresponse in the Second Wave of Longitudinal Household Surveys." In Survey Nonresponse, edited by Robert Groves, Don Dillman, John Eltinge, and Roderick Little, 259–72. Wiley Series in Probability and Statistics. New York: Wiley–Blackwell.
Lugtig, Peter. 2014. "Panel Attrition: Separating Stayers, Fast Attriters, Gradual Attriters, and Lurkers." Sociological Methods & Research 43 (4): 699–723. https://doi.org/10.1177/0049124113520305.
Lugtig, Peter, Marcel Das, and Annette Scherpenzeel. 2014. "Nonresponse and Attrition in a Probability-Based Online Panel for the General Population." In Online Panel Research: A Data Quality Perspective, 135–53. https://doi.org/10.1002/9781118763520.ch6.
Lynn, Peter, Nicholas Buck, Jonathan Burton, Annette Jäckle, and Heather Laurie. 2005. A Review of Methodological Research Pertinent to Longitudinal Survey Design and Data Collection. Institute for Social and Economic Research, ISER Working Papers.
McCurdy, Thomas, Thomas Mroz, and Mark R. Gritz. 1998. "An Evaluation of the National Longitudinal Survey on Youth." Journal of Human Resources 33 (2): 345–436. https://doi.org/10.2307/146435.
Roscoe, A. Marvin, Dorothy Lang, and Jagdish N. Sheth. 1975. "Follow-Up Methods, Questionnaire Length, and Market Differences in Mail Surveys." Journal of Marketing 39 (2): 20–27. https://doi.org/10.2307/1250111.
Russell, Stephen T. 2002. "Childhood Developmental Risk for Teen Childbearing in Britain." Journal of Research on Adolescence 12 (3): 305–24. https://doi.org/10.1111/1532-7795.00035.
Uhrig, Noah S.C. 2008. "The Nature and Causes of Attrition in the British Household Panel Study." In ISER Working Paper Series 2008-05. Institute for Social and Economic Research. https://ideas.repec.org/p/ese/iserwp/2008-05.html.
Voorpostel, Marieke. 2010. "Attrition Patterns in the Swiss Household Panel by Demographic Characteristics and Levels of Social Involvement." Swiss Journal of Sociology 36 (2): 359–77.
Voorpostel, Marieke, and Oliver Lipps. 2011. "Attrition in the Swiss Household Panel: Is Change Associated with Drop-Out?" Journal of Official Statistics 27 (2): 301–18.
