Survey Practice
Articles
Vol. 16, Issue 1, 2023. June 30, 2023 EDT

What paradata can tell us beyond response rates: exploring device use and push-to-web access methods in a survey of emergency department patients

Megan Mathews, Layla Parast, Marc Elliott, Stephanie Williamson, Sarah Joestl, Debra Stark, William Lehrman, Kirsten Becker
Keywords: paradata, web survey, mixed mode
https://doi.org/10.29115/SP-2023-0006
Mathews, Megan, Layla Parast, Marc Elliott, Stephanie Williamson, Sarah Joestl, Debra Stark, William Lehrman, and Kirsten Becker. 2023. “What Paradata Can Tell Us beyond Response Rates: Exploring Device Use and Push-to-Web Access Methods in a Survey of Emergency Department Patients.” Survey Practice 16 (1). https://doi.org/10.29115/SP-2023-0006.

Abstract

With the general decline in survey response rates and the increased use of mobile computing devices, many surveys employ mixed-mode protocols with web components to reduce costs and increase response rates. Using paradata generated from a web survey instrument administered to an emergency department (ED) patient population after discharge, we examined the device types and methods used to access and complete the web survey and the association of these factors with the method of web survey invitation. Of the 1,581 patients who accessed the web survey, 89% completed the survey by web. Smartphone completions (59%) outnumbered computer completions (35%); tablets (5%) were the least-used device. Including a text message notification in the contact protocol resulted in the highest percentage of smartphone completions: 69% to 73% of web respondents in experimental arms with text outreach completed the survey by smartphone, compared to 53% to 58% of web respondents in arms in which email was the only form of electronic invitation. Compared to computer and tablet users, smartphone users tended to be younger, less educated, in poorer self-rated health, and members of racial and ethnic minority groups who are often underrepresented in healthcare surveys. These results have implications for surveys that include web in a mixed-mode protocol, highlighting the necessity of a mobile-optimized design that accommodates smartphone users and the potential impact of web invitation and reminder protocols.

Introduction

Response rates for population surveys have been steadily declining, raising concerns among researchers and policy makers about the future of survey research (Czajka and Beyler 2016; National Research Council of the National Academies 2013). Seeking to stanch this trend, remain at the cutting edge of survey research, and work within the constraints of ever-shrinking budgets, an increasing number of federal surveys have explored the use of web components in their data collection efforts (Baumgardner, Doborah, and Raglin 2014; Bergeson et al. 2013; Labovitz, Patel, and Santander 2017; U.S. Census Bureau 2015).

When adding a web component to a survey protocol, survey researchers should consider not only respondents’ preferred device for web-based survey completion but also the method(s) of survey notification. How respondents receive the web survey invitation and reminder(s) may influence whether they complete the survey and the nature of their responses. Understanding how the mode of invitation to a web survey influences respondent behavior throughout the survey process is critical in survey planning.

Paradata refers to auxiliary process information acquired during a survey’s data collection process (West 2011). In the case of web-based surveys, paradata can provide insight into web users’ preferences, such as device type, access method, number of access attempts, and time spent on each question—this valuable information can inform survey design and analysis. While paradata was originally used primarily for interviewer performance monitoring, additional uses have emerged (Lynn and Nicolaas 2010). For example, researchers have used paradata to explore differential response tendencies in ‘check-all-that-apply’ item response option sequencing (Stern, Bilgen, and Rookey 2012) and to estimate future survey cost (Wagner 2019). In addition, survey methodologists have examined and quantified the impact of mixed device use, finding different rates of nonresponse, different lengths of time spent on items, and differences in number of access attempts depending on device type (Bosnjak, Bauer, and Weyandt 2017; Buskirk and Andrus 2014; Callegaro 2010).
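As an illustration of one common paradata-processing step, device type is typically inferred from the browser user-agent string logged with each survey hit. The sketch below is a simplified assumption of how such classification can work; the keyword checks are illustrative and are not the rules used by the authors or by any particular survey platform, which rely on much richer user-agent databases.

```python
def classify_device(user_agent: str) -> str:
    """Bucket a web-survey hit as smartphone, tablet, or computer.

    Simplified sketch: real paradata tooling uses far more detailed
    user-agent parsing than these keyword checks.
    """
    ua = user_agent.lower()
    if "ipad" in ua or ("android" in ua and "mobile" not in ua):
        return "tablet"       # iPads and Android devices without "Mobile"
    if "iphone" in ua or "mobile" in ua:
        return "smartphone"   # phones advertise "Mobile" in the UA string
    return "computer"         # default: desktop/laptop browsers

hits = [
    "Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X) Mobile/15E148",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (iPad; CPU OS 15_0 like Mac OS X) AppleWebKit/605.1.15",
]
print([classify_device(h) for h in hits])  # ['smartphone', 'computer', 'tablet']
```

Aggregating these labels over all hits yields the device-type distributions reported in the tables below.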

Using the paradata acquired from an emergency department (ED) experience-of-care survey in which a web component was the initial mode in several mixed-mode protocols, we aimed to answer the following questions: (1) how does the method of invitation to a web survey (mailed letter with URL; email; text) affect device use, (2) what is the association between how a respondent accessed the survey and their device selection, (3) what, if any, respondent characteristics differ by device type used, and (4) how many web survey invitations are needed?

Our contributions to the literature are threefold. First, we focus on a unique population—ED patients discharged home after receiving care—for which, to our knowledge, an extensive examination of paradata is currently unavailable. Second, we tested and compared nine different complex protocols for survey administration, eight of which included a web component, which gives us the ability to describe observed differences in paradata under multiple conditions. Third, our study had a relatively large sample size, which allows us to statistically examine differences in respondent characteristics by device type. These contributions have the potential to inform future web survey development and utilization, particularly, though not exclusively, in health care–related surveys.

Methods

This research is based on a randomized experiment that sampled 26,991 ED patients discharged to the community (i.e., not admitted to the hospital or another care facility) between January and March 2018 from 16 large, geographically dispersed hospital-based EDs and randomized patients within EDs to nine survey arms. The primary goal of the study was to examine response rates and compare utilization of various protocols across the survey arms. Eight of the nine arms included a web component option; arm 1 did not include a web component. Information about study design and results from the full study, including overall response rates by arm, protocol comparisons, comparisons of respondent characteristics by completion mode, and characteristics associated with the ability to email/text patients are presented elsewhere (Parast et al. 2019). Prior analyses of these data have demonstrated that inclusion of telephone mode increased response rates, that older patients and male patients were significantly less likely to be reached by email or text, and that telephone and web respondents were more representative of the sampled population demographic characteristics than mail respondents (Parast et al. 2019).

This paper focuses specifically on the patients who interacted with the web survey in the eight experimental arms with a web component, in which the sequence of paper, email, and text push-to-web notifications and reminders varied, as shown in Table 1. All of these protocols involved combinations of a mailed introduction letter (containing a URL with or without a Quick Response [QR][1] code directing to the survey), a text message, and/or an email inviting or reminding prospective respondents to complete the web survey. Sampled patients who did not complete the survey via the initial web mode outreach(es) within the allotted time were then contacted via a subsequent non-web mode: mailed paper and/or telephone survey.

Table 1. Experimental Design Protocols by Survey Arm.

| Arm (Protocol) | RR | Day 1 | Day 2 | Day 4 | Day 6 | Day 8 | Day 10 | Day 14 | Day 22 |
|---|---|---|---|---|---|---|---|---|---|
| 1: Mail + Phone (no web component) | 25.5% | Mailed survey | | | | | | | Phone attempt 1 (5 max) |
| 2: Paper invite + Email (x2) + Mail (x2) | 15.5% | Mailed letter (QR code & URL) | | | Emailed link | Emailed link & mailed survey | | | Mailed survey |
| 3: Paper invite + Email (x3) + Mail | 13.4% | Mailed letter (QR code & URL) | | | Emailed link | Emailed link | Emailed link | Mailed survey | |
| 4: Email (x4) + Phone | 22.9% | Emailed link | Emailed link | Emailed link | Emailed link | Phone attempt 1 (5 max) | | | |
| 5: Email (x4) + Mail + Phone | 27.3% | Emailed link | Emailed link | Emailed link | Emailed link | Mailed survey | | | Phone attempt 1 (5 max) |
| 6: Email (x4) + Mail (x2) | 15.3% | Emailed link | Emailed link | Emailed link | Emailed link | Mailed survey | | | Mailed survey |
| 7: Text + Email + Text + Email + Mail (x2) | 17.6% | Texted link | Emailed link | Texted link | Emailed link | Mailed survey | | | Mailed survey |
| 8: Paper invite + Text + Email (x2) + Mail | 14.4% | Mailed letter (QR code & URL) | | | Texted link | Emailed link | Emailed link | Mailed survey | |
| 9: Paper invite (URL only) + Email (x2) + Mail (x2) | 15.4% | Mailed letter (URL only) | | | Emailed link | Emailed link & mailed survey | | | Mailed survey |

Notes: RR = response rate across all completed respondents (web, mail, and phone); QR = Quick Response (barcode read by cellphone); URL = Uniform Resource Locator (web address). Contact day represents the days from the first contact attempt (day 1).

All patients assigned to arms with a mailed paper invitation to the web survey (arms 2, 3, 8, and 9) had the opportunity to access the web survey.[2] Thirty percent of sampled patients provided an email address and consented to email contact; 59% provided a valid cell telephone number and consented to text contact. As such, patients assigned to a survey arm with only email and/or text invitations to the web survey (arms 4, 5, 6, and 7) who did not provide the appropriate contact information or did not consent did not have the opportunity to access the web survey.

A total of 16,157 eligible[3] sampled patients were provided the option to access the web survey (61.5% of the eligible sampled population); of those, 1,581 sampled patients (9.8% of the population who could have accessed the survey) interacted with the web survey in some way, even if only opening it and viewing the introductory screen.

Our analyses focus on web paradata for eligible sampled patients, particularly patients who interacted with the web survey (N=1,581) and those who completed[4] the survey by web (N=1,408, or 89.1% of those who interacted with the web survey). Confirmit, a web survey software platform, was used for web survey administration and collection of web paradata (Confirmit 2019). Paradata collected included the number of times the web survey was accessed, the device used to access the web survey (i.e., smartphone, tablet, computer), and the method of invitation/reminder by which the web survey was accessed (i.e., link in an email, link in a text, URL typed into a browser, scanned QR code). Distributional frequencies of paradata were examined descriptively. Chi-squared tests were used to test for differences in web respondent self-reported characteristics (age, gender, race/ethnicity, primary language spoken at home, education, general health and overall mental health status, arrival by ambulance, and whether a proxy helped in completing the survey) by device type and access method used.
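As an illustration of the chi-squared test of independence used in these analyses, the sketch below computes the statistic in plain Python for the gender-by-device cross-tabulation reported in Table 3. The value 13.82 is the chi-squared critical value with 2 degrees of freedom at p = 0.001; a standard statistics library would normally be used instead of this hand-rolled computation.

```python
# Observed gender-by-device counts taken from Table 3 of this paper.
counts = {
    "Female": {"computer": 290, "smartphone": 592, "tablet": 53},
    "Male":   {"computer": 211, "smartphone": 240, "tablet": 21},
}

rows = list(counts)
cols = ["computer", "smartphone", "tablet"]
row_tot = {r: sum(counts[r][c] for c in cols) for r in rows}
col_tot = {c: sum(counts[r][c] for r in rows) for c in cols}
n = sum(row_tot.values())

# Pearson chi-squared statistic: sum of (observed - expected)^2 / expected.
chi2 = sum(
    (counts[r][c] - row_tot[r] * col_tot[c] / n) ** 2
    / (row_tot[r] * col_tot[c] / n)
    for r in rows
    for c in cols
)
dof = (len(rows) - 1) * (len(cols) - 1)
print(f"chi2 = {chi2:.1f} on {dof} df")  # exceeds 13.82, so p < 0.001 (***)
```

The statistic comfortably exceeds the 0.001 critical value, consistent with the *** flag on the gender row in Table 3.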

Results

Web survey participation

Of the 16,157 sampled patients who were given the opportunity to interact with the web survey, 1,581 (9.8%) did so. Of these 1,581 patients, 89% completed the survey by web, less than 1% completed by either mail or telephone during the non-web mode follow-up, and 9% did not complete the survey by any mode. Among the patients who interacted with the web survey but did not complete it, the majority (91%) did not engage with the survey beyond the introduction screen.

How does the method of invitation to a web survey affect device use?

Table 2 displays the distribution of device type used among respondents who completed the survey by web across the eight arms involving a web component. Smartphones (44% to 73% of completions across the 8 arms and 59% of all completions) far outnumbered computers (24% to 49% of completions across the 8 arms; 36% of all completions), and tablets were the least-used device (4% to 8% of completions; 5% overall). Text invitations (arms 7 and 8) to the web survey clearly boosted smartphone use (69% to 73% of completions) compared to paper and email web invitations (arms 2, 3, and 9; 44% to 53% of completions) or email-only web invitations (arms 4, 5, and 6; 57% to 58% of completions). A paper invitation without a text component (arms 2, 3, and 9) resulted in the largest proportions of web completions by computer (42% to 49%) compared to email-only web invitations (arms 4, 5, and 6; 35% to 38% of completions) or text and email invitations (arms 7 and 8; 24% to 27% of completions).

Table 2. Distribution of Device Type Used among Completed Web Surveys, by Survey Arm, % (N).

| Device Type | Arm 2 | Arm 3 | Arm 4 | Arm 5 | Arm 6 | Arm 7 | Arm 8 | Arm 9 | All Experimental Arms |
|---|---|---|---|---|---|---|---|---|---|
| Tablet | 3.6% (5) | 4.4% (8) | 5.4% (8) | 8.0% (13) | 6.9% (12) | 3.8% (9) | 3.8% (8) | 7.1% (11) | 5.3% (74) |
| Computer | 44.3% (62) | 42.3% (77) | 38.1% (56) | 35.0% (57) | 34.7% (60) | 23.5% (56) | 27.3% (57) | 48.7% (76) | 35.6% (501) |
| Smartphone | 52.1% (73) | 53.3% (97) | 56.5% (83) | 57.1% (93) | 58.4% (101) | 72.7% (173) | 68.9% (144) | 44.2% (69) | 59.2% (833) |

Notes: This table reflects descriptive statistics and includes all patients who accessed and completed the web survey (N=1,408). One case in Arm 3 had an unknown device type. Arm protocols are as defined in Table 1.

What is the association between how a respondent accessed the survey and their device selection?

Sixty-nine percent of web respondents who typed in the URL completed the survey on a computer (results not shown). For all other access methods (email, text, and QR code), the majority (56% to 99%) completed the web survey on a smartphone. Tablets were the least-chosen device for nearly all access methods; the exception was QR code access, for which no web respondent completed by computer.

What, if any, respondent characteristics differ by device type used?

Significant differences in web respondent characteristics were identified by both the device type and access method used to complete the survey. Table 3 shows the distribution of web respondent characteristics by device type. Both computer and tablet completers were more likely to be older (55 and older), non-Hispanic white, more educated (with a bachelor’s degree or more), and have excellent or very good self-reported mental health than smartphone completers. Computer completers also had a more even distribution of male and female respondents than smartphone and tablet completers, who were largely female.

Table 3. Distribution of Web Respondent Characteristics, by Device Type, % (N).

| Respondent Characteristic | Category | Computer (N=501) | Smartphone (N=833) | Tablet (N=74) |
|---|---|---|---|---|
| Age *** | 18-24 | 4.2% (21) | 9.3% (77) | 0.0% (0) |
| | 25-34 | 6.2% (31) | 16.8% (140) | 0.0% (0) |
| | 35-44 | 12.2% (61) | 21.3% (177) | 6.8% (5) |
| | 45-54 | 12.0% (60) | 19.2% (160) | 9.5% (7) |
| | 55-64 | 23.0% (115) | 15.1% (126) | 28.4% (21) |
| | 65-74 | 22.8% (114) | 11.8% (98) | 33.8% (25) |
| | 75-84 | 15.0% (75) | 5.5% (46) | 8.1% (6) |
| | 85+ | 4.8% (24) | 1.0% (8) | 13.5% (10) |
| Gender *** | Female | 57.9% (290) | 71.2% (592) | 71.6% (53) |
| | Male | 42.1% (211) | 28.8% (240) | 28.4% (21) |
| Race/ethnicity^a *** | White | 83.6% (419) | 67.3% (560) | 89.2% (66) |
| | Hispanic | 5.8% (29) | 11.5% (96) | 2.7% (2) |
| | Black | 3.6% (18) | 10.0% (83) | 4.1% (3) |
| | Asian or Pacific Islander | 1.4% (7) | 2.9% (24) | 1.4% (1) |
| | Native American | 0.4% (2) | 0.8% (7) | 0.0% (0) |
| | Multiracial | 1.6% (8) | 2.8% (23) | 0.0% (0) |
| | Unknown | 3.6% (18) | 4.7% (39) | 2.7% (2) |
| Primary language spoken at home | English | 94.8% (475) | 90.9% (756) | 95.9% (71) |
| | Spanish | 0.6% (3) | 2.5% (21) | 1.4% (1) |
| | Some other language | 1.6% (8) | 2.4% (20) | 1.4% (1) |
| | Unknown | 3.0% (15) | 4.2% (35) | 1.4% (1) |
| Educational attainment *** | 8th grade or less | 1.0% (5) | 2.0% (17) | 1.4% (1) |
| | Some high school, but did not graduate | 1.4% (7) | 3.5% (29) | 2.7% (2) |
| | High school or GED | 16.6% (83) | 19.8% (165) | 17.6% (13) |
| | Some college or 2-year degree | 32.1% (161) | 39.7% (330) | 20.3% (15) |
| | 4-year college graduate | 19.8% (99) | 16.0% (133) | 27.0% (20) |
| | More than 4-year college degree | 26.7% (134) | 15.0% (125) | 29.7% (22) |
| | Unknown | 2.4% (12) | 4.0% (33) | 1.4% (1) |
| Did someone help you complete this survey? | No | 94.0% (471) | 91.7% (763) | 95.9% (71) |
| | Yes | 4.2% (21) | 4.3% (36) | 2.7% (2) |
| | Unknown | 1.8% (9) | 4.0% (33) | 1.4% (1) |
| Self-rated overall general health | Poor | 4.6% (23) | 4.8% (40) | 6.8% (5) |
| | Fair | 17.0% (85) | 17.4% (145) | 10.8% (8) |
| | Good | 30.9% (155) | 32.0% (266) | 35.1% (26) |
| | Very Good | 35.9% (180) | 29.4% (245) | 31.1% (23) |
| | Excellent | 10.2% (51) | 13.1% (109) | 14.9% (11) |
| | Unknown | 1.4% (7) | 3.2% (27) | 1.4% (1) |
| Self-rated overall mental health * | Poor | 2.0% (10) | 4.0% (33) | 0.0% (0) |
| | Fair | 8.4% (42) | 9.4% (78) | 6.8% (5) |
| | Good | 20.2% (101) | 25.2% (210) | 16.2% (12) |
| | Very Good | 33.3% (167) | 29.6% (246) | 40.5% (30) |
| | Excellent | 34.5% (173) | 28.4% (236) | 35.1% (26) |
| | Unknown | 1.6% (8) | 3.5% (29) | 1.4% (1) |
| For this visit, did you go to the emergency room in an ambulance? | No | 84.8% (425) | 87.9% (731) | 83.8% (62) |
| | Yes | 15.2% (76) | 12.0% (100) | 16.2% (12) |
| | Unknown | 0.0% (0) | 0.1% (1) | 0.0% (0) |

Notes: Stars represent significant results from chi-squared tests for each characteristic; * 0.05 > p ≥ 0.01, *** p < 0.001.
^a Mutually exclusive groups.

Significant differences in respondent characteristics were also observed by access method (results not shown); this is expected given the previously discussed relationship between access method and device type. For instance, respondents who typed in the URL (mainly completing via computer) were more often older, non-Hispanic white, and more educated, and more evenly distributed by gender, than those accessing the web survey via an emailed or texted link (mainly completing via smartphone). Trends for QR code access are not discussed given the small number of web respondents who completed the survey via this method.

How many web survey invitations are needed?

Table 4 shows the distribution of web respondents by survey arm and the last protocol invitation or reminder sent prior to completion of the survey. These results are useful for considering whether all invitations are needed. For example, if the large majority of web respondents complete the survey after the first or second email, then subsequent reminder emails may not be necessary. Table 4 shows that across all experimental arms, the majority of web respondents completed the web survey before the third outreach attempt regardless of the protocol component. For example, for arms with an initial email invitation plus three email reminders (arms 4, 5, and 6), 36% to 41% of web respondents completed after receiving the initial invitation link and 60% to 61% had completed once the first email reminder was sent. In the arms that included a text component (arms 7 and 8), 55% to 64% of web respondents completed the survey after receiving a texted link. For arm 7 (initial text invitation followed by an email, text, and another email reminder), nearly half of web respondents completed the survey after the first text invitation.

Table 4. Distribution of the Outreach Attempt after Which Respondents Completed the Survey, by Survey Arm, % (N).

| Outreach Attempt | Outreach Method | Arm 2 | Arm 3 | Arm 4 | Arm 5 | Arm 6 | Arm 7 | Arm 8 | Arm 9 |
|---|---|---|---|---|---|---|---|---|---|
| 1st | Paper invite (URL) | 7.9% (11) | 13.7% (25) | -- | -- | -- | -- | 7.2% (15) | 7.7% (12) |
| 1st | Paper invite (QR code) | 2.9% (4) | 2.2% (4) | -- | -- | -- | -- | 1.0% (2) | -- |
| 1st | Text invite | -- | -- | -- | -- | -- | 47.9% (114) | -- | -- |
| 1st | Email invite | -- | -- | 36.7% (54) | 41.1% (67) | 35.8% (62) | -- | -- | -- |
| 2nd | Text reminder | -- | -- | -- | -- | -- | -- | 55.0% (115) | -- |
| 2nd | Email reminder | 50.0% (70) | 38.5% (70) | 23.1% (34) | 19.6% (32) | 23.7% (41) | 25.2% (60) | -- | 55.1% (86) |
| 3rd | Text reminder | -- | -- | -- | -- | -- | 16.0% (38) | -- | -- |
| 3rd | Email reminder | 39.3% (55) | 23.6% (43) | 18.4% (27) | 12.9% (21) | 19.7% (34) | -- | 21.1% (44) | 37.2% (58) |
| 4th | Email reminder | -- | 22.0% (40) | 21.8% (32) | 26.4% (43) | 20.8% (36) | 10.9% (26) | 15.8% (33) | -- |
| Web respondents (among all respondents) | | 29.7% (140) | 45.6% (182) | 21.1% (147) | 19.4% (163) | 33.8% (173) | 39.9% (238) | 48.6% (209) | 33.1% (156) |

Notes: This table reflects descriptive statistics. Arm protocols are as defined in Table 1. Outreach attempt represents the last outreach sent via the associated method before the respondent completed the survey; it does not necessarily represent the exact outreach used to complete the survey. For example, an individual in Arm 5 who completed the survey after an email reminder was sent as the second outreach attempt would be counted in the 2nd email reminder row, regardless of whether they responded to that reminder or actually clicked the link from the 1st email invitation.

Discussion

This research examined ED web respondent paradata across a number of mixed-mode protocols using varying invitation approaches and found that text message notifications pushed the largest proportion of patients to complete the survey via web, primarily using smartphones. We also found that smartphone users tended to be younger (18–54 years old), more likely to be members of racial and ethnic minority groups, less educated, and in worse self-rated mental health than computer and tablet respondents. We recommend text and email invitations and reminders to help reach and represent younger and ethnically diverse populations. Even though a mailed invitation with a URL may bolster push-to-web efforts for older and more-educated patients, we do not recommend this invitation method be used alone given cost considerations, lower response rates, and less representativeness than other push-to-web protocols.

Access method was strongly correlated with device type. We have demonstrated that text message notifications (either as an invitation or a reminder) in a web survey contact protocol yielded the highest proportion of web respondents completing on a smartphone. Slightly more than half of patients who responded to an emailed invitation used a smartphone to complete the survey. Mailed letters that required typing in a URL appeared to drive respondents to complete the web survey on a computer. Although the analyses presented here were descriptive, it appears that the ED population—similar to populations included in general population web surveys—uses a range of devices to complete a web questionnaire, with a substantial portion opting for smartphones (Bosnjak, Bauer, and Weyandt 2017). Future web survey work should focus on mobile optimization of the web survey given the large number of smartphone users.

Of those ED patients invited to the web survey, only 9.8% interacted with the web survey. However, it is notable that nearly all (89%) who chose to interact with the web survey eventually completed the survey by web, suggesting that a web survey is a very well-received administration mode for those who can be contacted in this manner.

Our study had several limitations. Hospitals that participated in this experiment were not nationally representative and not all patients had access to the web survey, so these results may not generalize to all ED patients in the United States, nor do they necessarily generalize outside of the ED population. In addition, because our research questions focused only on patients who interacted with the web survey, we did not discuss mail or telephone respondents; detailed results for all respondents can be found in Parast et al. (2019). While our examination of paradata generated from this research was exploratory and descriptive, it provides a glimpse of its value to researchers. Future investigations could build on the insights from paradata with larger-scale examinations of response patterns by device type and alternative push-to-web protocols designed to increase web survey completion rates. We hypothesize that even in other settings with potentially higher response rates and/or a different mix of patient characteristics, our lessons learned using paradata could be useful for informing the selection of methods for invitation to a web survey and the design of a future study using similar modes.


Conflict of Interests

This work was supported by the Centers for Medicare & Medicaid Services, Department of Health and Human Services (HHSM-500-2016-00093G). The content of this publication does not necessarily reflect the views or policies of the Department of Health and Human Services, nor does the mention of trade names, commercial products, or organizations imply endorsement by the U.S. government. The authors assume full responsibility for the accuracy and completeness of the ideas presented.


  1. A QR code, which consists of multiple black squares within a square grid on a white background that can be read by an imaging device such as a camera on a mobile device, points to a website or an application.

  2. 99.9 percent of patients had a mailing address available.

  3. Patient eligibility requirements are described further in Parast et al. (2019) and were applied prior to sampling as well as post-sampling using updated administrative data that were provided approximately 2–5 months after discharge. The overall post-sampling ineligibility rate for the full study was 2.6% (N=704).

  4. A completed survey was defined as one in which at least 50% of the items that were applicable to all patients were completed.
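The 50% completion rule in footnote 4 can be sketched as a simple predicate. This is an illustrative assumption about the data layout, not the authors' actual code: the item names and dictionary structure are hypothetical.

```python
def is_complete(responses: dict, applicable_items: list) -> bool:
    """True if at least 50% of the items applicable to all patients
    were answered (the completion rule described in footnote 4)."""
    answered = sum(1 for item in applicable_items if responses.get(item) is not None)
    return answered >= 0.5 * len(applicable_items)

# Hypothetical four-item survey: two answers meet the 50% threshold, one does not.
items = ["q1", "q2", "q3", "q4"]
print(is_complete({"q1": "Yes", "q2": 3}, items))  # True  (2 of 4 = 50%)
print(is_complete({"q1": "Yes"}, items))           # False (1 of 4 = 25%)
```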

Submitted: May 12, 2022 EDT

Accepted: June 01, 2023 EDT

References

Baumgardner, Stephanie K., Griffin H. Doborah, and David A. Raglin. 2014. “2014 American Community Survey Research and Evaluation Report Memorandum Series ACS 14-RER-21.” In The Effects of Adding an Internet Response Option to the American Community Survey, edited by James B. Treat. Washington, D.C.: U.S. Census Bureau.
Bergeson, Steven C., Janiece Gray, Lynn A. Ehrmantraut, Tracy Laibson, and Ron D. Hays. 2013. “Comparing Web-Based with Mail Survey Administration of the Consumer Assessment of Healthcare Providers and Systems (CAHPS) Clinician and Group Survey.” Primary Health Care: Open Access 3 (1): 1000132. https://doi.org/10.4172/2167-1079.1000132.
Bosnjak, Michael, Robert Bauer, and Kai W. Weyandt. 2017. “Mixed Devices in Online Surveys: Prevalence, Determinants, and Consequences.” In Mobile Research, edited by Axel Theobald, 53–65. Wiesbaden, Germany: Springer. https://doi.org/10.1007/978-3-658-18903-7_5.
Buskirk, Trent D., and Charles H. Andrus. 2014. “Making Mobile Browser Surveys Smarter: Results from a Randomized Experiment Comparing Online Surveys Completed via Computer or Smartphone.” Field Methods 26 (4): 322–42. https://doi.org/10.1177/1525822x14526146.
Callegaro, Mario. 2010. “Do You Know Which Device Your Respondent Has Used to Take Your Online Survey?” Survey Practice 3 (6): 1–12. https://doi.org/10.29115/sp-2010-0028.
Confirmit. 2019. “Confirmit.” https://www.confirmit.com.
Czajka, John L., and Amy Beyler. 2016. “Declining Response Rates in Federal Surveys: Trends and Implications.” Washington, D.C.: Mathematica Policy Research.
Labovitz, Jonathan, Neil Patel, and Israel Santander. 2017. “Web-Based Patient Experience Surveys to Enhance Response Rates.” Journal of the American Podiatric Medical Association 107 (6): 516–21. https://doi.org/10.7547/16-001.
Lynn, Peter, and Gerry Nicolaas. 2010. “Making Good Use of Survey Paradata.” Survey Practice 3 (2): 1–5. https://doi.org/10.29115/sp-2010-0010.
National Research Council of the National Academies. 2013. “Nonresponse in Social Science Surveys: A Research Agenda.” In Panel on a Research Agenda for the Future of Social Science Data Collection, Committee on National Statistics, edited by Roger Tourangeau and Thomas J. Plewes. Washington, D.C.: The National Academies Press.
Parast, Layla, Megan Mathews, Marc N. Elliott, Anagha Tolpadi, Elizabeth Flow-Delwiche, William G. Lehrman, Debra Stark, and Kirsten Becker. 2019. “Effects of Push-To-Web Mixed Mode Approaches on Survey Response Rates: Evidence from a Randomized Experiment in Emergency Departments.” Survey Practice 12 (1): 1–26. https://doi.org/10.29115/sp-2019-0008.
Stern, Michael J., Ipek Bilgen, and Bryan D. Rookey. 2012. “Toward Understanding Response Sequence in Check-All-That-Apply Web Survey Questions: A Research Note with Results from Client-Side Paradata and Implications for Smartphone Question Design.” Survey Practice 5 (4): 1–7. https://doi.org/10.29115/sp-2012-0020.
U.S. Census Bureau. 2015. “2020 Census Operational Plan.” In A New Design for the 21st Century. Washington, D.C.
Wagner, James. 2019. “Estimation of Survey Cost Parameters Using Paradata.” Survey Practice 12 (1): 1–10. https://doi.org/10.29115/sp-2018-0036.
West, Brady T. 2011. “Paradata in Survey Research.” Survey Practice 4 (4): 1–8. https://doi.org/10.29115/sp-2011-0018.
