Vol. 11, Issue 2, 2018 · January 22, 2018 EDT

Giving Respondents Voice? The Feasibility of Voice Input for Mobile Web Surveys

Melanie Revilla, Mick P. Couper, Carlos Ochoa
Keywords: web surveys, mobile devices, voice input, opt-in panel
https://doi.org/10.29115/SP-2018-0007
Survey Practice
Revilla, Melanie, Mick P. Couper, and Carlos Ochoa. 2018. “Giving Respondents Voice? The Feasibility of Voice Input for Mobile Web Surveys.” Survey Practice 11 (2). https://doi.org/10.29115/SP-2018-0007.

Abstract

More and more respondents are completing Web surveys on mobile devices. Some studies have found that responses to open narrative questions are affected by the use of mobile devices, especially smartphones. Given the rise of voice input on smartphones in general, it is worth investigating the feasibility of voice input for answering open questions on smartphones. The goal of this research note is to study (1) the extent to which people already use voice input on their mobile devices (in “everyday” life), (2) the extent to which they would be willing to use voice input to answer open questions in Web surveys, and (3) the factors that may affect the willingness to use voice input. Based on data from a survey in Spain in 2016, we found that around half of our sample already uses voice input at least sometimes and that around half would be willing to use it to answer open narrative questions if this were possible. Several variables were found to have a significant association with willingness to use voice input: the use of voice input in everyday life, the frequency of Internet access through a smartphone, and the perceived difficulty of answering the current survey.

Introduction

More and more respondents are completing Web surveys using mobile devices, mainly tablets and smartphones. Previous research has investigated the impact of the device used on the comparability and quality of the data obtained from open narrative questions. Several studies have found that nonresponse to open questions is higher on mobile devices (Lambert and Miller 2015; Mavletova 2013), completion times for such questions are longer (Mavletova 2013; Revilla and Ochoa 2016), and the length of open-ended responses is shorter (Lambert and Miller 2015; Lugtig and Toepoel 2016; Peterson et al. 2013; Struminskaya, Weyandt, and Bosnjak 2015; Wells, Bailey, and Link 2014). However, Buskirk and Andrus (2014) found no significant differences in length by device, and Antoun, Couper, and Conrad (2017) found that smartphone respondents provided longer answers to open questions.

Based on the evidence suggesting potential problems with open questions on mobile devices, Revilla and Ochoa (2016) suggested investigating the potential use of voice input functions to make it easier and quicker for respondents to answer open narrative questions on mobile devices. The goal of this research note is to study (1) how often people use voice input on their mobile devices (in “everyday” life), (2) the extent to which they would be willing to use voice input to answer open questions in Web surveys, and (3) the factors associated with willingness to use voice input. This is a largely understudied area.

A search for data on the extent of general voice input use yielded only three studies. Parks Associates (2016) reported that 39% of smartphone owners use some sort of voice recognition software such as Siri or Google Now. A Google study (Callaham 2014) revealed that 59% of teens and 41% of adults reported using voice search more than once a day (see also Google 2014). Bajarin (2016) reported that 65% of a consumer panel said that they had used Siri, Google’s “OK Google or voice search”, or Microsoft’s Cortana. None of these studies provide sufficient detail to evaluate these estimates. Voice input is also increasingly used in some instant messaging apps (e.g., WhatsApp) instead of typing, but again data are hard to find.

We are aware of no studies that have explored the issue of voice input in the survey context. However, Cape (2015) tested the feasibility of video input, finding that about half of Survey Sampling International panelists were capable of recording a video in response to an open question, and that about half of those actually did so when asked. Schober et al. (2015) examined voice input (relative to text input) in a survey experiment among iPhone users. They reported that voice users provided better quality data (rounded less, straight-lined less, and reported more sensitive information). However, respondents expressed a stronger preference for text input than for voice when answering mostly closed-ended questions.

Given the rise in voice input technology and the increased use of smartphones, the potential use of voice input for answering open questions on a mobile device seems an area ripe for exploration.

Method and Data

We used data from a survey implemented in Spain in September–October 2016 within the Netquest panel, an opt-in online panel (www.netquest.com). The target sample was restricted to panelists who had Internet access through both PC and smartphone. Cross quotas for age and gender were used to ensure that the distributions of these variables in the sample were similar to the ones observed in the full panel.

The survey contained a maximum of 69 questions. Answers were not mandatory. The full survey (in Spanish) can be found at http://ww2.netquest.com/respondent/glinn/mobile2016. A total of 1,476 respondents (i.e., 48.4% of those who started; 90.9% of those who answered the first main survey question) completed the survey and formed the focus of our analyses. We focused on two key questions:

  • VI1: How often do you use voice input on a smartphone or tablet? (“1-Never” to “5-Always”)

  • VI2: If it were possible in the surveys we send you to use the voice input option to answer open questions, would you use it? (“Definitely yes”, “Probably yes”, “Probably no”, “Definitely no”)

We looked at the answers to these two questions and at how these are related to each other and to other variables described in the following section. Given the nonprobability nature of the sample, all analyses used unweighted estimates.
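
As a rough illustration of how the unweighted distributions of these two items (reported in Tables 1 and 2 below) could be tabulated, here is a minimal Python sketch; the file name and column names are assumptions for illustration and are not part of the actual study materials.

```python
import pandas as pd

# Hypothetical respondent-level file; the actual Netquest data are not public.
df = pd.read_csv("netquest_mobile2016.csv")

# VI1: frequency of voice input use (1 = "Never" ... 5 = "Daily")
# VI2: willingness to use voice input for open questions (4-point item)
for item in ["VI1", "VI2"]:
    answered = df[item].dropna()
    pct = answered.value_counts(normalize=True).sort_index() * 100  # unweighted shares
    print(f"{item} (N = {len(answered)})")
    print(pct.round(1), "\n")
```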

Hypotheses

We expect that people who already use voice input in their everyday lives will be more interested in also using it in surveys. Furthermore, we expect older people, less educated people, people who use the Internet on a smartphone less frequently, and people who do not have Spanish as their mother tongue to have more difficulty typing their answers to open questions on mobile devices, and thus to show more interest in the possibility of using the voice input option. For similar reasons, we expect respondents who reported that the current survey was difficult to answer and those in the highest quartile of estimated completion time (i.e., over 15 minutes) to show more interest in this possibility. In addition, using voice input might be easier for respondents who answer from home and who are alone while answering than for those at work or in the presence of others. We thus expect answering from home and being alone while answering to have a positive association with stated interest in using voice input to answer open questions.

Results

Frequency of Use of Voice Input on Mobile Devices

First, we look at how often respondents use voice input on a smartphone or tablet. Table 1 shows the distribution of the answers to this question (VI1).

Table 1 Frequency of use of voice input.
How often do you use voice input    % (N=1,469)
1. Never                            49.1
2                                   19.5
3                                   17.3
4                                    5.7
5. Daily                             8.4

Around half of the sample report never using voice input. Fewer than one in ten respondents (8.4%) report daily use of voice input. Still, this suggests that voice input is already being used at least to some extent on mobile devices.

Stated Use of Voice Input to Answer Open Questions

Second, we consider willingness to use voice input to answer open questions in surveys (VI2). Table 2 shows the distribution of responses.

Table 2 Willingness to use voice input to answer open questions.
Would you use voice input for open questions    % (N=1,469)
Definitely yes                                  12.7
Probably yes                                    41.3
Probably no                                     33.9
Definitely no                                   12.0

Overall, the distribution is quite balanced, with a small shift to the positive side: a majority (54.1%) of respondents said they definitely or probably would use voice input to answer open questions if it were possible.

Relationships With Other Variables

To what extent is willingness to use voice input to answer open questions related to current use of voice input and to other variables described above? We first examine bivariate relationships using a series of chi-square tests and then use a multivariable regression.

Bivariate Analyses

Table 3 presents the row percentages and significance tests for willingness to use voice input. We collapsed the dependent variable (VI2) into two categories, combining “definitely yes” and “probably yes” into “would use” and “definitely no” and “probably no” into “would not use.” Similarly, we collapsed current use of voice input (VI1) into never use (1) versus use (2–5).

Table 3 Percent willing to use voice input (VI) in surveys, by selected characteristics and chi-square tests.
Variables                          Values                 Would use VI (%)   n       Chi-square   P-value
Use voice input                    Never use              40.8               716     96.538       .000
                                   Use                    66.4               747
Gender                             Women                  53.7               844     1.111        .292
                                   Men                    56.6               541
Age                                18-24                  54.0               285     4.236        .375
                                   25-34                  51.2               402
                                   35-44                  53.1               426
                                   45-54                  58.5               241
                                   54+                    58.4               113
Education                          Secondary or less      58.7               385     4.404        .036
                                   More than secondary    52.5               1,082
Internet use on smartphone         Daily                  54.8               1,345   4.315        .038
                                   Less often             45.1               122
Mother tongue Spanish              No                     44.7               103     4.033        .045
                                   Yes                    54.9               1,361
Answer survey from home            No                     52.8               415     0.808        .369
                                   Yes                    55.4               1,033
Alone while answering survey       No                     56.3               513     1.105        .293
                                   Yes                    53.5               939
Difficulty answering the survey    No                     55.0               1,396   8.069        .005
                                   Yes                    37.3               67
Completion time                    <=15 minutes           53.7               1,107   2.590        .107
                                   >15 minutes            58.8               313
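
As a minimal sketch of how the collapsing and chi-square tests behind Table 3 might be implemented (assuming a hypothetical data file and illustrative variable names and codings, not the authors’ actual code):

```python
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("netquest_mobile2016.csv")  # hypothetical file, as above

# Collapse VI2 into "would use" vs. "would not use"
# (assumes VI2 is coded 1 = "Definitely yes" ... 4 = "Definitely no")
df["would_use"] = df["VI2"] <= 2
# Collapse VI1 into never use (1) vs. use (2-5)
df["uses_vi"] = df["VI1"] > 1

def chi_square_row(var):
    """Percent willing to use voice input by category, plus a chi-square test."""
    counts = pd.crosstab(df[var], df["would_use"])
    pct_would_use = counts[True] / counts.sum(axis=1) * 100
    chi2, p, dof, _ = chi2_contingency(counts)
    return pct_would_use.round(1), round(chi2, 3), round(p, 3)

# Illustrative predictor names; the real variables follow Table 3
for var in ["uses_vi", "gender", "age_group", "education"]:
    print(var, *chi_square_row(var))
```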

We find significant associations with willingness to use voice input for several variables. Specifically, those who currently use voice input on their smartphones are more willing to use it for open questions in surveys. Willingness is also higher among those with lower education, frequent smartphone Internet users, native Spanish speakers, and those who did not report difficulty answering the current survey. We expected that those who might benefit more from voice input (i.e., those who use the Internet on a smartphone less frequently, those who do not have Spanish as their mother tongue, and those who had more difficulty answering the survey) would show more interest in the possibility of using voice input, but we found the opposite. This might be because these people do not see the voice input option as a way to help them answer more easily (as we expected) but instead as a new tool that may make it even more difficult for them to participate.

Regression Analyses

Finally, we move from bivariate to multivariate analyses and run a regression with VI2 as the dependent variable. We use the original 4-category variable, reverse-coded so that a high score (and thus positive coefficients) means greater willingness.

We use the same variables as in Table 3 as independent variables, except that we use the original variables instead of the categorical versions created for Table 3 in the case of VI1 (1 = “Never”, 5 = “Daily”), frequency of Internet use on a smartphone (1 = “Less than once a month”, 6 = “Daily”), and difficulty of answering the survey (1 = “Very easy”, 4 = “Very difficult”). Table 4 shows the coefficients and p-values for this linear regression analysis.

Table 4 Regression analysis with VI2 (1=“Definitely not”, 4=“Definitely yes”) as the dependent variable.
Independent variables              Coefficients   P-values
Use voice input (VI1)              .219           .000
Men                                .040           .381
Age                                .038           .047
Education                          -.032          .191
Freq. Int. use on Smartphone       .093           .005
Mother tongue Spanish              .097           .264
Answer survey from home            .090           .071
Alone while answering survey       -.018          .698
Difficult to answer the survey     -.209          .000
Long estimated completion times    .090           .092
Constant                           1.777          .000

R2=.143; Adjusted R2=.136; N=1,309

When controlling for the other variables in a multivariate analysis, several of the relationships found in the bivariate analyses (i.e., the use of voice input in everyday life, the frequency of Internet use on a smartphone, and the perceived difficulty of answering the current survey) remain significant. On the other hand, two variables (education and Spanish as a mother tongue) are no longer significant, and age shows a marginally significant positive relationship with willingness to use voice input for answering open questions. We also ran a logistic regression (1=willing, 0=not willing) and generally found similar results. Education is statistically significant (p=.023) in the logistic model and age falls just short of significance (p=.055), but the signs of the coefficients for all predictors remain the same.
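
To make the model specifications concrete, the following is a minimal sketch of how the linear regression in Table 4 and the logistic robustness check might be set up; the file, variable names, and codings are illustrative assumptions, not the authors’ actual code.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("netquest_mobile2016.csv")  # hypothetical file, as above

# Reverse-code VI2 so that a high score means greater willingness
# (assumes the raw item is coded 1 = "Definitely yes" ... 4 = "Definitely no")
df["willing_4pt"] = 5 - df["VI2"]

predictors = ("VI1 + male + age + education + freq_internet_smartphone + "
              "spanish_mother_tongue + answered_from_home + alone_while_answering + "
              "survey_difficult + long_completion_time")

# Linear model corresponding to Table 4
ols = smf.ols(f"willing_4pt ~ {predictors}", data=df).fit()
print(ols.summary())

# Robustness check: binary logistic regression (1 = willing, 0 = not willing)
df["willing_bin"] = (df["willing_4pt"] >= 3).astype(int)
logit = smf.logit(f"willing_bin ~ {predictors}", data=df).fit()
print(logit.summary())
```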

Discussion

Overall, we are interested in the possibility of using a voice input option to facilitate respondents’ answers to open narrative questions in Web surveys. We used data from a Web survey implemented in an online opt-in panel in Spain in 2016, in which respondents were asked about their current use of voice input in everyday life and about their willingness to use voice input in a survey to answer open questions. All respondents had Internet access on smartphones.

We found that around half of the sample report using voice input on a mobile device (smartphone or tablet) at least sometimes, and a little more than half (54.1%) report being willing to use this option to answer open questions if possible. We found three variables that have a robust relationship with willingness to use voice input for answering open questions in surveys: the use of voice input in everyday life (the greater the use, the greater the willingness); the frequency of Internet access through a smartphone (the greater the frequency, the greater the willingness); and the perceived difficulty of answering the current survey (the greater the difficulty, the lower the willingness).

An important limitation of this study is that we only considered what respondents said they would do and had no information about actual use of voice input in a survey if allowed (and/or encouraged) to do so. Even though previous research has shown that stated willingness is a useful measure in its own right, especially if the goal is to examine reasons for and covariates of (un)willingness (Couper and Singer 2013; Couper et al. 2008), we expect that actual compliance rates may be lower than those based on expressed willingness. Given that we surveyed members of an opt-in panel who had Internet access on their smartphones, these estimates of voice input use and willingness are likely to be high relative to the general population.

More research is needed to explore the likely use of voice input in surveys, and how the use of voice input could affect survey participation, data quality, and respondent satisfaction with the survey. Further research is also needed to explore the potential technical and respondent-related barriers to the use of voice input and to replicate these findings in other settings. The next step is to test whether smartphone users will actually use voice input to answer open questions in a survey when given the option and encouraged to do so.

References

Antoun, C., M.P. Couper, and F.G. Conrad. 2017. “Effects of Mobile versus PC Web on Survey Response Quality: A Crossover Experiment in a Probability Web Panel.” Public Opinion Quarterly 81 (5): 280–306.
Bajarin, B. 2016. “The Voice-First User Interface Has Gone Mainstream: A Quantitative Study of Amazon’s Echo, Apple’s Siri and Google’s OK Google Shows Where and How They’re Being Used by Consumers.” http://www.recode.net/2016/6/7/11868200/the-voice-first-user-interface-has-gone-mainstream.
Buskirk, T.D., and C. Andrus. 2014. “Making Mobile Browser Surveys Smarter: Results from a Randomized Experiment Comparing Online Surveys Completed via Computer or Smartphone.” Field Methods 26 (4): 322–42.
Callaham, J. 2014. “Google Study: Over Half of Teenagers Use Voice Search Daily.” http://www.androidcentral.com/google-study-over-half-teenagers-use-voice-search-daily.
Cape, P. 2015. “Imagine the Possibilities: Using Video to Reinvigorate the Open Question.” In CASRO Digital Research Conference. Nashville, TN. http://www.casro.org/?2015DRCFiles.
Couper, M.P., and E. Singer. 2013. “Informed Consent for Web Paradata Use.” Survey Research Methods 7 (1): 57–67.
Couper, M.P., E. Singer, F.G. Conrad, and R.M. Groves. 2008. “Risk of Disclosure, Perceptions of Risk, and Concerns about Privacy and Confidentiality as Factors in Survey Participation.” Journal of Official Statistics 24 (2): 255–75.
Lambert, A.D., and A.L. Miller. 2015. “Living with Smartphones: Does Completion Device Affect Survey Responses?” Research in Higher Education 56 (2): 166–77.
Lugtig, P., and V. Toepoel. 2016. “The Use of PCs, Smartphones, and Tablets in a Probability-Based Panel Survey: Effects on Survey Measurement Error.” Social Science Computer Review 34 (1): 78–94.
Mavletova, A. 2013. “Data Quality in PC and Mobile Web Surveys.” Social Science Computer Review 31 (4): 725–43.
Peterson, G., J. Mechling, J. LaFrance, J. Swinehart, and G. Ham. 2013. “Solving the Unintentional Mobile Challenge.” In The CASRO Online Research Conference. San Francisco, CA.
Revilla, M., and C. Ochoa. 2016. “Open Narrative Questions in PC and Smartphones: Is the Device Playing a Role?” Quality and Quantity 50 (6): 2495–2513.
Schober, M.F., F.G. Conrad, C. Antoun, P. Ehlen, S. Fail, A.L. Hupp, M. Johnston, L. Vickers, H.Y. Yan, and C. Zhang. 2015. “Precision and Disclosure in Text and Voice Interviews on Smartphones.” PLoS ONE 10 (6): e0128337.
Struminskaya, B., K. Weyandt, and M. Bosnjak. 2015. “The Effects of Questionnaire Completion Using Mobile Devices on Data Quality. Evidence from a Probability-Based General Population Panel.” Methods, Data, Analyses 9 (2): 261–92.
Wells, T., J.T. Bailey, and M.W. Link. 2014. “Comparison of Smartphone and Online Computer Survey Administration.” Social Science Computer Review 32 (2): 238–55.
