Survey Practice, Vol. 3, Issue 6, November 30, 2010

Mixed-Mode Surveys and the Internet

Edith D. de Leeuw
https://doi.org/10.29115/SP-2010-0030

Leeuw, Edith D de. 2010. “Mixed-Mode Surveys and the Internet.” Survey Practice 3 (6). https://doi.org/10.29115/SP-2010-0030.

Mixed-mode surveys try to combine the best of all possible worlds by exploiting the advantages of different modes to compensate for their weaknesses (De Leeuw 2005). This idea is as old as the survey method itself. As early as 1788, Sir John Sinclair used a mixed-mode approach. Lacking funds for a full statistical census, Sinclair used a cost-effective mail survey among the ministers of all parishes in the Church of Scotland. To achieve a high response, he also used follow-up letters and finally ‘statistical missionaries’, who personally visited the late responders to hurry the ministerial replies along. It worked; Sinclair achieved a 100% response (Hacking 1990; Whithers 2010). That such a sequential mixed-mode strategy, in which one method follows upon another, is still highly successful in the 21st century is illustrated by the American Community Survey. To achieve a high response rate at affordable cost, a mail survey is used first, with follow-up telephone interviews for nonrespondents. In the last stage, face-to-face interviews are conducted with a subsample of the remaining nonrespondents (“American Community Survey Design and Methodology Report” 2009, Chapter 7).
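
To make the arithmetic of such a sequential design concrete, the Python sketch below walks through three stages with hypothetical response rates and a hypothetical subsampling fraction (illustrative numbers only, not the ACS figures):

```python
# Sequential mixed-mode design: each stage attempts the nonrespondents
# left over from the previous, cheaper stage. All rates are hypothetical.
stages = [
    # (mode, response rate among attempted cases, fraction of nonrespondents followed up)
    ("mail",         0.55, 1.0),
    ("telephone",    0.35, 1.0),
    ("face-to-face", 0.60, 1 / 3),  # only a subsample is visited in person
]

remaining = 1.0   # share of the sample that has not yet responded
cumulative = 0.0
for mode, rate, followed_up in stages:
    gained = remaining * followed_up * rate
    cumulative += gained
    remaining -= gained
    print(f"{mode:13s} adds {gained:6.1%}  (cumulative: {cumulative:.1%})")
# When only a subsample is followed up, the responding subsample is
# weighted up in estimation to represent all remaining nonrespondents.
```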

Mail surveys are still a good survey tool, but Internet surveys are now frequently the mode of choice for many survey organizations. Internet methods are even more cost-efficient and make large surveys affordable (e.g., Couper 2000; De Leeuw 2005; Smyth and Pearson 2011). The Internet offers further advantages, such as the potential for complex questionnaires and quick turnaround. While acknowledging its immense potential, Couper (2000) pointed out that undercoverage due to limited Internet penetration and low response rates are the main disadvantages of Internet surveys. In 2010, the Pew Internet and American Life Project estimated that 79% of American adults use the Internet. However, Internet usage is not evenly spread over all groups: non-Hispanic blacks, the elderly, those with lower incomes, those with less education, and those living in rural areas are underrepresented. Mixed-mode surveys involving the Web may offer a solution to these problems (Couper and Miller 2008).
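
The potential size of this undercoverage problem can be made concrete with the standard coverage-bias expression from the survey error literature (the formula is not given in this article, so take it as background): the bias of a mean estimated from the covered (online) population is the noncovered share of the population times the covered-noncovered difference,

\[
B(\bar{y}_C) = \frac{N_{NC}}{N}\,\bigl(\bar{Y}_C - \bar{Y}_{NC}\bigr).
\]

With Pew's 79% penetration figure, \(N_{NC}/N = 0.21\); a variable on which Internet users and non-users differ by 10 percentage points would therefore be biased by roughly 0.21 × 10 ≈ 2 percentage points in a web-only survey.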

Data Equivalence

When a mixed-mode approach is used for contacts and reminders only, and the data collection itself is restricted to a single mode as in Sinclair's survey, there is no threat of mode effects on measurement. However, when different subgroups of respondents are surveyed with different modes of data collection, mode effects may influence the resulting data, and differential measurement error may threaten the validity of the results (De Leeuw 2005). This places mixed-mode researchers between a rock and a hard place: one of the main reasons for switching modes is to reach respondents who cannot be contacted otherwise. In other words, we deliberately survey different respondents with different methods in order to reduce coverage and nonresponse error. But how big is the threat of mode effects?
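
In the later mixed-mode literature this dilemma is often written as a decomposition (not used in this article; added here as a clarifying aside): the observed difference between two modes confounds who responds with how they answer,

\[
\bar{y}^{(1)} - \bar{y}^{(2)} = \underbrace{\bigl(\mu^{(1)} - \mu^{(2)}\bigr)}_{\text{selection}} + \underbrace{\bigl(b^{(1)} - b^{(2)}\bigr)}_{\text{measurement}},
\]

where \(\mu^{(m)}\) is the true mean among the respondents reached in mode \(m\) and \(b^{(m)}\) is the measurement bias of mode \(m\). Mixing modes precisely so that different people respond in each mode makes the selection term nonzero by design, which is what makes the measurement term hard to isolate.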

Mode Effects

In the past, extensive mode comparisons were made for the traditional data collection methods: face-to-face interviews, telephone surveys, and self-administered paper mail questionnaires. De Leeuw (1992, Chapter 3) performed a meta-analysis of 67 articles and papers reporting mode comparisons. The resulting overview showed consistent but usually small differences between methods, suggesting a dichotomy of survey modes: those with and those without an interviewer. Additional sources of mode effects are the use of aural versus visual communication and whether the interviewer or the respondent controls the question-answer sequence (e.g., Dillman, Smyth, and Christian 2009, Chapter 8). What about new media like the Web?
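
For readers unfamiliar with how a meta-analysis condenses 67 mode comparisons into a single statement, the sketch below pools hypothetical standardized mode differences by inverse-variance weighting, a generic fixed-effect summary; the numbers are invented and this is not De Leeuw's (1992) exact procedure:

```python
# Generic fixed-effect meta-analytic pooling of standardized mode
# differences d_i; all numbers below are invented for illustration.
effects = [0.05, 0.12, -0.03, 0.08, 0.02]        # d_i from five hypothetical studies
variances = [0.010, 0.025, 0.015, 0.020, 0.012]  # sampling variance of each d_i

weights = [1.0 / v for v in variances]           # precise studies count more
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
se = (1.0 / sum(weights)) ** 0.5
print(f"pooled d = {pooled:.3f} (SE = {se:.3f})")
# A pooled d near zero with a small SE is the quantitative form of
# "consistent but usually small differences between methods".
```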

Self-administered Paper Questionnaires versus Internet

The importance of the medium of administration for data quality has long been recognized in diagnosis and assessment. When computerized forms of tests were introduced, the American Psychological Association (1986, 18) explicitly stated that “… the equivalence of scores from computerized versions should be established and documented before using norms or cutting scores obtained from conventional tests.” This led to numerous empirical comparisons between computerized and paper-and-pencil versions of well-known tests, which in turn resulted in quantitative summaries and meta-analyses (e.g., Mead and Drasgow 1993). In general, administration mode had no statistically significant effect (for an overview, see De Leeuw and Hox 2011). For Internet surveys this evidence of test equivalence is promising indeed: after more than three decades of investigation, computerized tests appear to be accepted as valid and reliable alternatives to traditional methods, and online tests can be seen as a special case of computer-assisted testing (Epstein and Klinkenberg 2001). Still, there are differences between computerized and Internet administration. Computerized testing is usually done under very controlled conditions, while an Internet survey or online test may be completed from any number of locations, at any time, and relatively free of controls. Despite these differences, test data collected over the Internet and via paper-and-pencil appear to be largely equivalent (cf. De Leeuw and Hox 2011).

These results are promising for mixing Internet and other forms of self-administered questionnaires. In general, it seems reasonable to assume that respondents use the same psychological processes and the same metric when responding to Internet and other self-administered questionnaires, although the precise implementation of a web survey may play a role here too (e.g., whether skipping a question or answering ‘do not know’ is allowed, and whether one can page forward or backward; cf. Couper 2008).

Interviews versus Internet Surveys

There are fewer comparisons of Internet surveys with interview surveys, either telephone or face-to-face, and as a consequence there are, as yet, no comprehensive meta-analyses summarizing mode effects for Internet versus interview surveys. However, one effect is consistently found in the available studies: Internet surveys appear to give rise to less social desirability bias than interviews. In this sense, Internet surveys are indeed more like self-administered questionnaires and share their benefits, as Couper (2008) postulated. For instance, Link and Mokdad (2005) found more self-reported heavy drinkers in a web survey in the USA than in a telephone interview. This result remained strong and significant after adjusting for differences in the demographic characteristics of respondents in the two modes. Kreuter, Presser, and Tourangeau (2008) confirmed and extended these findings: Internet administration increased the reporting of sensitive information among alumni in the USA. Kreuter, Presser, and Tourangeau also had access to record data and found higher accuracy in web survey answers to factual questions on educational achievement (e.g., GPA); they report that web surveys increased both the level of reporting of sensitive information and its accuracy compared to CATI, with the more private self-administered telephone mode (IVR) falling in between. Similar results have been found in Europe (for an overview, see De Leeuw and Hox 2011).
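
The demographic adjustment that Link and Mokdad describe can be sketched as a logistic regression in which the coefficient on a mode indicator is the adjusted mode effect. Everything below (data, variable names, and effect sizes) is synthetic and purely illustrative, not their actual analysis:

```python
# Synthetic illustration: does a web-vs-telephone difference in reported
# heavy drinking survive adjustment for respondent demographics?
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "web": rng.integers(0, 2, n),      # 1 = web respondent, 0 = telephone
    "age": rng.integers(18, 80, n),
    "female": rng.integers(0, 2, n),
})
# Simulate more reporting of the sensitive behavior on the web,
# i.e., less social desirability bias in the self-administered mode.
logit_p = -2.0 + 0.5 * df["web"] - 0.01 * df["age"]
df["heavy_drinker"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# The coefficient on `web` is the mode difference adjusted for demographics.
fit = smf.logit("heavy_drinker ~ web + age + female", data=df).fit(disp=False)
print(fit.params["web"], fit.pvalues["web"])  # positive and significant here
```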

Conclusion

Empirical mode comparisons show relatively small differences between the Internet and other modes of data collection, with the exception of Internet and interview mixes for sensitive questions; for a detailed review, see De Leeuw and Hox (2011). This seems reassuring, but mode comparison experiments usually take extreme care in designing and implementing equivalent questionnaires. In daily survey practice, differences in question wording and question format between specific modes may be a bigger threat than mode effects. It is therefore advisable to use equivalent questionnaires in a unified mode design as far as possible (Dillman, Smyth, and Christian 2009, 326). A good example of a unified mode design is the short form of the US Census (Martin et al. 2007).

References

“American Community Survey Design and Methodology Report.” 2009. Chapter 7. http://www.census.gov/acs/www/methodology/methodology_main/.
American Psychological Association. 1986. Guidelines for Computer-Based Tests and Interpretations. Washington, DC: American Psychological Association (APA).
Couper, M.P. 2000. “Web Surveys: A Review of Issues and Approaches.” Public Opinion Quarterly 64:464–94.
———. 2008. Designing Effective Web Surveys. New York: Cambridge University Press.
Couper, M.P., and P.V. Miller. 2008. “Web Survey Methods: Introduction to the Special Issue of POQ on Web Survey Methods.” Public Opinion Quarterly 72 (5): 831–35. http://poq.oxfordjournals.org/content/vol72/issue5/#ARTICLES.
De Leeuw, E.D. 1992. Data Quality in Mail, Telephone, and Face-to-Face Surveys. Amsterdam: TT-Publikaties. http://www.xs4all.nl/~edithl.
———. 2005. “To Mix or Not to Mix Data Collection Modes in Surveys.” Journal of Official Statistics 21:233–55. http://www.jos.nu/Articles/abstract.asp?article=212233.
De Leeuw, E.D., and J.J. Hox. 2011. “Internet Surveys as Part of a Mixed Mode Design.” In Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies, edited by M. Das, P. Ester, and L. Kaczmirek, 45–76. New York: Taylor & Francis Group.
Dillman, D.A., J.D. Smyth, and L.M. Christian. 2009. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 3rd ed. Hoboken, NJ: Wiley.
Epstein, J., and W.D. Klinkenberg. 2001. “From Eliza to Internet: A Brief History of Computerized Assessment.” Computers in Human Behavior 17:295–314.
Hacking, I. 1990. The Taming of Chance. Cambridge, UK: Cambridge University Press.
Kreuter, F., S. Presser, and R. Tourangeau. 2008. “Social Desirability Bias in CATI, IVR, and Web Surveys: The Effect of Mode and Question Sensitivity.” Public Opinion Quarterly 72 (5): 847–65. http://poq.oxfordjournals.org/content/vol72/issue5/#ARTICLES.
Link, M.W., and A.H. Mokdad. 2005. “Effects of Survey Mode on Self-Reports of Adult Alcohol Consumption: A Comparison of Mail, Web and Telephone Approaches.” Journal of Studies on Alcohol, 239–45.
Martin, E., J.H. Childs, T. DeMaio, J. Hill, C. Reiser, E. Gerber, K. Styles, and D. Dillman. 2007. “Guidelines for Designing Questionnaires for Administration in Different Modes.” Washington, DC: US Bureau of the Census. http://www.census.gov/srd/mode-guidelines.pdf.
Mead, A.D., and F. Drasgow. 1993. “Equivalence of Computerized and Paper-and-Pencil Cognitive Ability Tests: A Meta-Analysis.” Psychological Bulletin 114 (3): 449–58.
Pew Internet and American Life Project. 2010. “Demographics of Internet Users.” http://www.pewinternet.org/Trend-Data/Internet-Adoption.aspx.
Smyth, J.D., and J.E. Pearson. 2011. “Internet Survey Methods: A Review of Strengths, Weaknesses, and Innovations.” In Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies, edited by M. Das, P. Ester, and L. Kaczmirek, 11–44. New York: Taylor & Francis Group.
Whithers, C.W.J. 2010. “Scotland Accounted for: An Introduction to the Old (1791–1799) and the New (1834–1845) Statistical Accounts of Scotland.” EDINA, The Statistical Accounts of Scotland. http://edina.ac.uk/stat-acc-scot/reading/.
