Filling the Void: Gaining a Better Understanding of Tablet-based Surveys

Tom Wells Nielsen

Justin T. Bailey NPD Group

Michael W. Link Nielsen

Online Surveys on Mobile Devices

“Survey respondents are increasingly attempting to take surveys on their mobile devices, whether researchers intend for this or not” (Cazes et al. 2011, p. 2). Approximately 50 percent of US adults own a smartphone (Nielsen 2012; Smith 2012), and approximately 20 percent of US adults own a tablet (Rainie 2012).

These trends have serious implications for online surveys, especially those designed specifically for a computer screen and not modified, or optimized, for the smaller screen typical of a mobile device. In this paper, we present results from tablet, computer, and smartphone administrations of a survey. For each, we examine three measures of survey taking behavior. Our main focus is on surveys taken with tablets and whether tablet survey administration is comparable to computer survey administration. Our results are preliminary but instructive, since there is currently very little research on tablet administration of online surveys. With tablet ownership on the rise, however, understanding the effects of this survey mode will become increasingly important. Just as tablets have filled the void between the often difficult-to-read smartphone screen and the difficult-to-transport computer, they can also fill the void for mobile survey takers.

Previous Research

Online surveys taken on mobile devices can present problems. Perhaps the most serious is survey breakoff. Previous research on mobile web surveys (typically those not optimized for mobile devices) has reported breakoff rates in the range of 25–70 percent (Callegaro 2010, Callegaro and Macer 2011).

Similarly, Peterson (2012) reports that unintended mobile respondents break off twice as often and take 25–50 percent longer to complete online surveys than computer respondents. However, his research summary focuses on unintended mobile respondents taking surveys on smartphones, not tablets.

Currently, a very small percentage of respondents (about 1 percent) are taking online surveys on tablets (Callegaro and Macer 2011; Guidry 2012; McClain et al. 2012), and very little research exists on tablet administration of online surveys.

In one of the few studies to address this, Guidry (2012) analyzed data from the National Survey of Student Engagement (NSSE), an annual online survey of undergraduate students. In 2012, 3.8 percent of NSSE respondents took the online survey on a smartphone and 0.4 percent took it on an iPad. (No other types of tablets were used.) Guidry found that iPad respondents had similar abandonment rates as computer respondents (and much lower rates than smartphone respondents), similar rates of item-missing data, and similar rates of response non-differentiation (and much lower rates than smartphone respondents).

In this paper, we add to this nascent research by comparing tablet, computer, and smartphone administrations of a survey among a national sample of adults.

Current Study

One of the original objectives of this study was to test mobile phone surveys against surveys completed on a computer. For the mobile survey, we utilized a smartphone survey app, the Survey on Demand App (SODA), developed by Techneos (a Confirmit company). The survey app has been programmed for all major smartphone operating systems, with a separate optimized visual design for each. See Buskirk and Andrus (2012) for a discussion of this app-based smartphone survey approach.

In this study, the same survey was administered to smartphone respondents and online respondents. The questionnaire contained 24 questions on consumer behavior, Internet usage, and TV viewing habits. The survey was designed primarily with mobile app respondents in mind. It featured short questions, short response lists, no grid items, minimal need for vertical scrolling, and was relatively short.

The survey was fielded to a large, national sample of online panelists from KnowledgePanel®, the probability-based online panel maintained by Knowledge Networks (a GfK company). For the mode effect research being conducted, the sample was restricted to smartphone users (to avoid confounding survey mode with respondent characteristics). Panelists were pre-screened 1 week prior to the survey.1 Of the 2443 eligible smartphone users, 1254 were randomly assigned to take the survey on their smartphone, via the mobile app, and 1187 were randomly assigned to take the survey online, on a computer (as they usually do).

Those assigned to the mobile app mode were emailed instructions to download and install the survey app to their smartphone and were provided a survey code to start the survey. This second step was taken to ensure that only those assigned to the mobile app survey could access it. Those assigned to the online mode were sent email invitations which contained a link to the survey and were instructed to complete the survey on a PC or laptop. A total of 732 panelists responded to the mobile app survey and 725 responded to the online survey, representing survey participation rates of 58 percent and 61 percent, respectively.

We received a total of 705 completed mobile app surveys and 711 completed online surveys. Tables 1 and 2 present the modes and platforms actually used to complete the survey. Among those randomly assigned to the online mode, 128 of the panelists completed the survey on a smartphone, rather than on a computer (as instructed). We also identified 33 unintended mobile respondents who completed the survey with a tablet, and more specifically, an iPad. No other types of tablets were used to take the survey.

Table 1 Survey assignment and survey completion modes.

Survey Completion Mode     Mobile app assignment   PC web assignment
Mobile app                 705                     –
PC web                     –                       550
Mobile web – smartphone    –                       128
Mobile web – iPad          –                       33
Did not complete           549                     476
Total                      1254                    1187

Table 2 Completed surveys by survey administration mode and platform.

Mode and Platform Number of Completed Surveys Percentage of Completed Surveys
Mobile app 705 49.8%
 Android 324 22.9%
 BlackBerry 72 5.1%
 iPhone 299 21.1%
 Other* 10 0.7%
PC web 550 38.8%
Mobile web 161 11.4%
 Android 48 3.4%
 iPhone 70 4.9%
 iPad 33 2.3%
 Other** 10 0.7%
Total 1416 100.0%

*Included in this category: one Nokia user, one Windows mobile user, eight with inconsistent data on type of mobile device. These respondents are not included in the main analysis.

**Included in this category: six BlackBerry users, three iPod users, one other, n.e.c. These respondents are not included in the main analysis.


These panelists accessed the survey by opening the email invitation on their mobile device. Fortunately, the paradata we collected included the user agent string, which identifies the type of browser and device used to access the survey.
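As an illustration, user agent paradata can be mapped to a coarse device category with simple keyword rules. This is a minimal sketch; the keyword checks below are assumptions for illustration, not the detection logic actually used in the study.

```python
# Minimal sketch: classify a respondent's device from the user-agent
# string collected as paradata. The keyword rules are illustrative
# assumptions, not the study's actual detection logic.

def classify_device(user_agent: str) -> str:
    """Return a coarse device category from a browser user-agent string."""
    ua = user_agent.lower()
    if "ipad" in ua:
        return "tablet (iPad)"
    if "iphone" in ua or ("android" in ua and "mobile" in ua):
        return "smartphone"
    # Everything else (including desktop browsers) is treated as a computer.
    return "computer"

ipad_ua = ("Mozilla/5.0 (iPad; CPU OS 5_1 like Mac OS X) "
           "AppleWebKit/534.46 Mobile/9B176 Safari/7534.48.3")
print(classify_device(ipad_ua))  # tablet (iPad)
```

In practice, user agent strings are far messier than these rules suggest, which is one reason production panels rely on maintained device-detection libraries rather than hand-written checks.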

As shown in Table 2, the majority of the unintended mobile respondents completed the survey with Android and iPhone smartphones. This occurred despite the fact that the survey was neither optimized nor intended for smartphone mobile web administration. Buskirk and Andrus (2012) describe this situation as the “passive-mobile browser survey approach,” one that entails many disadvantages.

In terms of demographic characteristics, iPad respondents were significantly more likely than other respondents to have at least a Bachelor’s degree, to have household income of at least $75,000, to be married, and to be homeowners. Not surprisingly, 39 percent reported that they primarily use a tablet to access the Internet (compared to 15 percent of other respondents).

On the other hand, the 128 unintended smartphone respondents were significantly more likely than other respondents to be young, female, to reside in larger households, and to access the Internet primarily with their smartphone (61 percent vs. 26 percent of others).

Presented in Figure 1A–1D are screenshots taken from the mobile app, mobile web, tablet, and computer administrations of the survey.

Figure 1(A–D) Screenshots of mobile app, mobile web, tablet, and computer survey administrations.



In our analysis, we examine three measures of survey taking behavior – breakoff rates, survey completion times, and item-missing data – among tablet respondents, computer respondents, and smartphone respondents.

Survey Breakoff

As shown in Table 3, breakoff rates for the survey were quite low, across all modes and platforms. However, the breakoff rates for the mobile web respondents were noticeably higher, consistent with findings reported by Peterson (2012). Within this group of unintended mobile respondents, the breakoff rate for iPad respondents was about half of that for Android and iPhone smartphone respondents, consistent with findings from Guidry (2012).

To test for differences in breakoffs by survey administration mode and platform, we estimated a logistic regression equation. This multivariate analysis allows us to predict the odds of breakoff by mode and platform while statistically controlling for the demographic characteristics of respondents. Consistent with the patterns displayed in Table 3, the regression results reveal that mobile app respondents and smartphone web respondents (both Android and iPhone) were significantly more likely to break off than computer respondents. On the other hand, there was no significant difference in the odds of breakoff between iPad respondents and computer respondents.
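The flavor of this analysis can be seen even without the demographic controls: unadjusted breakoff odds ratios can be computed directly from the Table 3 counts. The sketch below uses the published counts; the paper's actual model additionally adjusts for demographics.

```python
# Unadjusted odds of breakoff vs. PC web, from the Table 3 counts.
# The paper's actual analysis is a multivariate logistic regression
# with demographic controls; this sketch omits those controls.

counts = {  # platform: (breakoffs, completed surveys)
    "PC web":  (5, 550),
    "Android": (3, 48),
    "iPhone":  (5, 70),
    "iPad":    (1, 33),
}

def odds(breakoffs: int, completes: int) -> float:
    # odds = p / (1 - p), which reduces to breakoffs / completes
    return breakoffs / completes

base = odds(*counts["PC web"])
ratios = {p: odds(b, c) / base for p, (b, c) in counts.items()}
for platform, oratio in ratios.items():
    print(f"{platform}: odds ratio vs PC web = {oratio:.1f}")

# Note: the iPad ratio rests on a single breakoff, so any confidence
# interval around it is very wide -- consistent with the
# non-significant iPad-vs-computer difference reported in the text.
```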

Table 3 Breakoffs by survey administration mode and platform.

Mode and Platform Number of Breakoffs Breakoff Rate
Mobile app 27 3.7%
PC web 5 0.9%
Mobile web 9 5.3%
 Android 3 5.9%
 iPhone 5 6.7%
 iPad 1 2.9%
Total 41 2.8%

Note: We were unable to collect any survey data or paradata from mobile app respondents who did not complete the survey. Thus, we cannot calculate breakoff rates by mobile app platform.

Survey Completion Time

Summary statistics for survey completion times are presented in Table 4. In general, respondents completed the survey in a median time of about 5.5 minutes. This was shorter than anticipated for a 24-question survey, but in part, can be explained by the use of short questions and short sets of response options. Extreme outliers inflate the values of means and standard deviations, but they are presented for the sake of the interested reader.

Table 4 Completion times by survey administration mode and platform.

Mode and Platform    Median    Mean    St. Dev.    Range (minutes)
Mobile app 5.5 31.3 341.8 1.8–7133.0
 Android 5.5 19.0 190.6 1.8–3415.0
 BlackBerry 5.6 10.8 36.0 2.2–310.0
 iPhone 5.5 50.4 485.5 2.1–7133.0
PC web 5.3 16.2 77.3 1.9–1121.9
Mobile web 8.2 56.6 207.8 2.7–1417.6
 Android 8.0 45.3 179.4 2.7–1143.2
 iPhone 10.2 83.2 269.8 4.5–1417.6
 iPad 5.1 18.6 68.7 3.0–400.7
Total 5.6 28.3 255.9 1.8–7133.0

Focusing on median completion time, mobile web respondents required much more time than others to complete the survey, consistent with findings from Peterson (2012). However, iPad respondents completed the survey in a median time of 5.1 minutes. In contrast, those completing the mobile web survey using Android and iPhone smartphones took much longer – 8.0 minutes and 10.2 minutes, respectively. This is not surprising given that the survey was not optimized for smartphone web administration. The questions and text appeared very small on a smartphone browser. Reading the small print, zooming, and selecting among small radio buttons and check boxes required more time (and increased respondent burden).

To test for differences in survey completion time by mode and platform, we estimated an ordinary least squares regression equation.2 Compared to computer respondents, survey completion time is significantly longer among Android and iPhone mobile app respondents (but not among BlackBerry respondents). Similarly, regression results reveal that Android and iPhone mobile web respondents take significantly more time to complete the survey than computer respondents. However, once again, significant differences were not uncovered between iPad respondents and computer respondents.
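The preprocessing described in footnote 2 (a log transform of completion time, after trimming 5 percent of cases at each tail) can be sketched as follows. The completion times used here are fabricated placeholders, not study data.

```python
import math
import statistics

# Sketch of the completion-time preprocessing from footnote 2: trim a
# fixed share of observations at each tail, then take the natural log
# to tame the heavy right skew. The times below are fabricated, not
# the study's data.
times = [2.1, 3.4, 4.8, 5.2, 5.5, 5.6, 6.0, 7.3, 9.9, 310.0]  # minutes

def trim_and_log(times, pct=0.05):
    """Drop the fastest and slowest `pct` of cases; return log-times."""
    s = sorted(times)
    k = int(len(s) * pct)
    trimmed = s[k:len(s) - k] if k else s
    return [math.log(t) for t in trimmed]

log_times = trim_and_log(times, pct=0.10)
print(f"{len(log_times)} cases kept, "
      f"median log-time = {statistics.median(log_times):.2f}")
```

The log transform is a standard remedy here: as Table 4 shows, a handful of extreme outliers (respondents who left the survey open for hours or days) dominate the raw means and standard deviations.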

Item Non-Response

Finally, we consider item non-response across the different survey modes and platforms. Presented in Table 5 are percentages of respondents who skipped at least one question in the survey.3

Table 5 Item non-response by survey administration mode and platform.

Mode and Platform Percentage
Mobile app 12.3%
 Android 9.6%
 BlackBerry 20.8%
 iPhone 12.7%
PC web 10.7%
Mobile web 8.1%
 Android 4.2%
 iPhone 10.0%
 iPad 9.1%
Total 11.2%

With the computer and mobile app administrations, approximately 10–13 percent of respondents did not respond to at least one item in the survey. The percentage is about double among BlackBerry respondents, despite the fact that the mobile app survey was also optimized for BlackBerry devices.

With the mobile web administration, approximately 8–10 percent of respondents did not respond to at least one question in the survey. Interestingly, Android mobile web respondents were much less likely to skip survey items than others, although it is not clear why.

Again, we estimate a logistic regression equation, in this case, to predict the odds of skipping at least one question in the survey. Controlling for demographic factors, BlackBerry respondents were significantly more likely than computer respondents to not answer at least one survey question. However, no other significant differences in item non-response by mode or platform were uncovered, consistent with findings from Guidry (2012).
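The respondent-level measure described in footnote 3 (did the respondent skip at least one of the 24 questions?) can be sketched as below; the response data are fabricated for illustration.

```python
# Respondent-level item non-response (footnote 3): flag a respondent if
# they skipped at least one question, regardless of how many. The
# answer data below are fabricated for illustration.

responses = {
    "r1": [1, 2, None, 4],     # skipped one item
    "r2": [1, 1, 2, 2],        # answered everything
    "r3": [None, None, 3, 1],  # skipped two items (still counts once)
}

skipped_any = {rid: any(a is None for a in answers)
               for rid, answers in responses.items()}
rate = sum(skipped_any.values()) / len(skipped_any)
print(f"respondents skipping >= 1 item: {rate:.0%}")  # 67%
```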


Discussion

Based on the descriptive and multivariate analyses across the three measures examined, tablet survey administration appears to be comparable to computer survey administration. Across each measure, differences in survey taking behaviors were small and not statistically significant, consistent with findings from Guidry (2012).

At the same time, with two of the measures – breakoff rates and survey completion time – we consistently uncovered differences between smartphone administration and computer administration. Not surprisingly, differences are more pronounced among smartphone web respondents.

These are intriguing but preliminary results and conclusions, as they are based on a small and self-selected group of tablet respondents. In addition, our results apply only to iPad respondents, since no other types of tablets were used to complete the survey. Still, this provides initial evidence that tablets can fill the void between traditional online surveys and those taken on a mobile device. While tablets are not as widely used as smartphones, they share common characteristics with them, such as portability and touchscreen design. These features can be leveraged when designing surveys for tablets.

However, more research needs to be done to understand fundamental behavioral differences of people when they use smartphones and tablets. For example, while tablets are no doubt more portable than most computers, they likely are not used in the same way that smartphones are used. Instead, tablet usage might take on more characteristics of traditional online surveys. That is, they might be more commonly used when the respondent is seated, focused, and single-tasking, rather than the on-the-go, multi-tasking behaviors of smartphone users. Understanding the key differences and similarities of smartphone behaviors and tablet behaviors will play a critical role in survey design for both modes.

Furthermore, tablets are not often considered to be cellular devices, like smartphones. Although some tablets have cellular capabilities, they are not used as a communication device in the same way a smartphone is. Voice calling and text messaging on tablets are not as common. This distinction further differentiates smartphones and tablets and contributes to the differences in how both are used.

Thus, we encourage additional research on tablet survey administration. Currently, large-scale online surveys (100,000+ respondents) may yield enough tablet respondents to support firmer conclusions, despite the low tablet penetration worldwide. Tablets occupy a unique niche between smartphones and personal computers, and this research is an early attempt to better define how tablets can be used as survey tools.


References

Buskirk, T.D. and C. Andrus. 2012. Smart surveys for smart phones: exploring various approaches for conducting online mobile surveys via smartphones. Survey Practice. Available at:
Callegaro, M. 2010. Do you know which device your respondent has used to take your online survey? Survey Practice December.
Callegaro, M. and T. Macer. 2011. Designing surveys for mobile devices: pocket-sized surveys that yield powerful results. Short course presented at the Annual Meeting of the American Association for Public Opinion Research, Phoenix, AZ.
Cazes, J., L. Townsend, H. Rios and J. Hughes. 2011. Evolving best practices in mobile surveys and online administration. [Kinesis Survey Technologies Whitepaper.] Available at:
Guidry, K.R. 2012. Response quality and demographic characteristics of respondents using a mobile device on a web-based survey. Paper presented at the Annual Meeting of the American Association for Public Opinion Research, Orlando, FL.
McClain, C.A., S.D. Crawford and J.P. Dugan. 2012. Use of mobile devices to access computer-optimized web instruments: implications for respondent behavior and data quality. Paper presented at the Annual Meeting of the American Association for Public Opinion Research, Orlando, FL.
Nielsen. 2012. America’s new mobile majority: a look at smartphone owners in the U.S. Available at:
Peterson, G. 2012. Unintended mobile respondents. Paper presented at the Annual Council of American Survey Research Organizations Technology Conference, New York, NY.
Rainie, L. 2012. Tablet and e-book reader ownership nearly double over the holiday gift-giving period. [Pew Internet and American Life Project Report.] Available at:
Smith, A. 2012. Nearly half of American adults are smartphone owners. [Pew Internet and American Life Project Report.] Available at:
1 A total of 25,221 active panelists were sent the smartphone screener survey. A total of 10,156 responded over a 2-day period. Of those, 2,443 were identified as smartphone owners and those willing to complete a survey on their smartphone.
2 To account for the pronounced positive skew in survey completion time, we used the natural logarithm of completion time as the dependent variable. In addition, for the OLS analysis, we removed outliers – the 5 percent completing the survey in less than 3.0 minutes and the 5 percent completing the survey in more than 26.3 minutes.
3 Note that this is a respondent-level measure, not a question-level measure of item non-response. Item non-response across each of the 24 questions was 1 percent or less. This is true across all modes and platforms.
