Introduction
Over the past decades, survey research has evolved alongside changes and advances in technology and interpersonal communication. Survey data are generally collected using a variety of methods, including face-to-face, mail, telephone, and web surveys. In selecting the appropriate mode of data collection, researchers must consider time constraints, budgets, and estimated response rates, and across all modes, researchers face many of the same problems: costs are increasing, response rates with traditional methods are dropping, and reaching diverse and representative samples of people using one mode is difficult (Groves 2011).
In terms of communication, there is no doubt that mobile phone adoption and usage have exploded; in fact, “the cell phone is the most quickly adopted consumer technology in the history of the world” (Raine 2013, 1), and it is reported that 91 percent of all American adults own a cell phone (Raine 2013). Additionally, 79 percent of all cell phone owners say they use text messaging on their phones (Pew Research Internet Project 2014). Of particular interest to survey researchers, some traditionally harder-to-reach populations, such as low-income and minority populations, have been quick to adopt mobile phone technologies and use text-messaging services at a higher rate than others (Smith 2011). Thus, in this age of prevalent mobile phone ownership and texting, exploring text messaging as a means to conduct survey research with a broad range of citizens is judicious.
The aim of this study is to explore whether automated SMS text messages can be used as a primary mode of data collection, and specifically, whether survey takers will respond to a “cold text” inviting them to participate in a short survey in a regional setting. We hypothesized that, because of the prevalence of cell phone ownership and the high rate of text message usage, particularly among traditionally harder-to-reach populations, contacting a sample via text message may be an effective and less expensive alternative to more conventional data collection methods. While we expected low response rates due to recipients’ unfamiliarity with text-based surveys and the potential for such a survey to be seen as an intense personal intrusion, we saw this as an important opportunity to explore a relevant technology for measuring public opinion for social science research purposes.
Background
To date, there have been a few preliminary explorations of the use of text messaging in social science research (Buskirk, Callegaro, and Steeh 2004; Jespersen et al. 2013; Walker 2011). All studies that have used text messages to support other means of data collection have concluded that text messaging is beneficial in increasing participation and response rates, although no true experiments have been conducted. Steeh, Buskirk, and Callegaro (2007) conducted an experiment comparing the response rate of a traditional cold-call phone survey against a treatment condition in which a text message invited recipients to call in and participate in a survey; they found response rates associated with the text message invite to be lower than those of the cold call. Three known studies have asked participants to respond to questions via text (Cooke, Nielsen, and Strong 2003; Johansen and Wedderkopp 2010; Kuntesche and Robert 2009), but in all cases, respondents were already enrolled in an established study or were members of an online panel; none received a “cold text.” Of importance, Cooke, Nielsen, and Strong (2003) also established reasons for online survey panel participants’ refusal to participate in text-message-based research, which included the personal relationship they have with their mobile telephone, the intrusive nature of SMS, the perceived cost of receiving surveys and of sending replies, the volume of SMS spam, the limited in-box storage capacity on mobile phones, and the belief that online surveys are more convenient.
Given the current landscape, it is clear that there is substantial interest in utilizing text messages in social science research, and the studies that have employed the technology have seen positive results. However, much about using text messages remains unknown. In particular, it is unclear how people will respond to a cold text, and how they will participate in a survey conducted exclusively via text. The study described in this paper attempts to address this gap in the literature.
Research Methodology and Design
Research Questions
- How will people respond to receiving an invitation to participate in a “cold text” survey?
- In what ways will they respond?
- Who will participate?
- How long after receiving the survey invitation will they respond?
- If people begin the survey but do not complete it, when do they stop?
- What are reasons that people do not respond?
- How are the survey completion rates related to knowledge of the topic?
The results are presented below sequentially, aligned with these questions.
Instrument Design
To achieve the goals of this study, the Institute for Survey Research (ISR) at Temple University partnered with The Friends of the Wissahickon (FOW), a local non-profit organization overseeing the well-being of a local park. FOW wanted to survey nearby residents of Philadelphia, Pennsylvania, to better understand their use of the park and their awareness of the organization. An 8- to 9-question survey was designed with skip logic built in, so that depending on one’s responses to earlier questions, different sequences of questions would appear (see Appendix A).
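Conceptually, this routing can be represented as a table keyed by question and answer. The sketch below is a minimal illustration in Python using the question labels from Appendix A; it is not the configuration of the actual SMS platform used in the study.

```python
# A minimal sketch of how the survey's skip logic could be represented,
# using the question IDs from Appendix A. Illustrative only; not the
# actual configuration of the SMS platform used in the study.

SURVEY_FLOW = {
    # question_id: (question_text, {normalized_answer: next_question_id})
    "C": ("Are you aware of Wissahickon Valley Park (home of Valley Green)?",
          {"yes": "D", "no": "E"}),
    "D": ("Do you currently use Wissahickon Valley Park?",
          {"yes": "F", "no": "F"}),
    "E": ("The Wissahickon is an 1,800-acre park in NW Philadelphia...",
          {"yes": "F", "no": "F"}),
    "F": ("What is your home zip code?", {}),  # open response
}

def next_question(current_id, answer):
    """Return the next question ID for a normalized answer, or None."""
    _, routing = SURVEY_FLOW[current_id]
    return routing.get(answer.strip().lower())
```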
Sample
To administer the survey, the project team acquired a simple random sample of 1,000 mobile numbers associated with the six zip codes that surround the park. Appropriate cell phone numbers were identified through a sample of the “rate center”[1] bordering the park in Philadelphia, to which billing zip codes were then appended.
Data Collection
Data collection for this study took place over the course of one full week, beginning on Saturday, November 15th, 2014, and culminating on Friday, November 21st. On the first Saturday at 10:00 a.m. local time, a series of two texts (an informational text and a recruitment/consent text) was sent to the 1,000 randomly selected individuals in the sample file. All text messages were sent from a local Philadelphia phone number (area code 215). The two-part text stated:
- Informational text: “Hi from Temple’s Institute for Survey Research! We have a few Qs for you, participation is voluntary – text QUIT anytime or contact Nina Hoe at 215.204.4441”
- Recruitment/Consent text: “Would you answer a few Qs via text about Wissahickon Park? (You’ll be entered to WIN an iPAD mini) Reply YES or QUIT”
Two days later, on Monday, November 17th at 5:00 p.m., a follow-up reminder text was sent to all recipients who had not responded in any way to the original texts. The survey was left open for one week. Participants who replied “Yes” received a series of multiple choice and open response survey questions. Subsequent questions were only sent once the participant had replied to the previous question.
Methods of Analysis
To answer the designated research questions, several methods of analysis were employed. Data generated from survey responses were merged with the original sample file and imported into Stata 12 for quantitative analysis, including summary and descriptive statistics. Rates of participation and response, completion status, survey responses, and the reliability of demographic data were analyzed in this manner.
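As a rough illustration of the merge step (rendered here in Python/pandas rather than Stata, with assumed file and column names), the workflow might look like the following.

```python
import pandas as pd

# A sketch of the merge step described above. File and column names
# ("sample_frame.csv", "phone_number", etc.) are assumptions, not the
# study's actual data files.
sample = pd.read_csv("sample_frame.csv")          # 1,000 sampled numbers
responses = pd.read_csv("survey_responses.csv")   # SMS platform export

# Left-merge so every sampled number is retained, respondent or not.
merged = sample.merge(responses, on="phone_number", how="left")

# Participation and completion rates over the full sample.
began = merged["began_survey"].eq(True).mean()
completed = merged["completed_survey"].eq(True).mean()
print(f"Began: {began:.1%}, Completed: {completed:.1%}")
```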
To better understand text-message recipients’ reasons for not responding to the text, follow-up phone calls were made to a random sample of non-respondents. Non-respondents were asked to indicate their reason(s) for not responding, and answers were transcribed and coded thematically (see Appendix B).
Results
Result 1: Survey Response Rates
Of the 1,000 cold text survey invitations, a total of 70 recipients (7 percent) began the survey and 36 (3.6 percent) completed it. Using the American Association for Public Opinion Research (AAPOR) RR1 formula shown below, we calculated an overall response rate of 3.6 percent. One hundred seventy-two people “unsubscribed” from the study. In total, 242 recipients engaged with the text message in some way. Table 1 shows the frequency and percentage of participation and completion.
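For reference, the standard AAPOR RR1 formula divides complete interviews (I) by the entire sample, including partial interviews (P), refusals (R), non-contacts (NC), other non-interviews (O), and cases of unknown eligibility (UH, UO). Treating all 1,000 sampled numbers as the denominator reproduces the 3.6 percent figure:

```latex
\mathrm{RR1} = \frac{I}{(I + P) + (R + NC + O) + (UH + UO)} = \frac{36}{1000} = 0.036
```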
Result 2: First Responses
As a methodological exploration, this study was concerned not only with the response rate, but also with recipients’ general response(s) to the survey, and specifically how they interacted with the automated texts. A total of 242 individuals responded to the “Recruitment Text” in some way. The initial responses fell into five categories, classified as (1) correct consent (e.g., “YES” or “Yes”); (2) incorrect consent (e.g., “okay,” “y”); (3) correct non-consent (e.g., “QUIT” or “Quit”); (4) incorrect non-consent (e.g., “No,” “OUIT,” “WUIT,” “not now”); and (5) other/consent intention unclear (e.g., “Call me please,” “What is the survey”).
Table 2 shows the distribution of responses. In total, 15 percent of respondents responded incorrectly, either in an attempt to consent or to decline. This suggests that an open response format may be challenging for participants, and modifying the instructions to ask people to reply with “1-Yes, 2-No” (as is standard on other platforms) may result in more reliable answers.
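A minimal sketch of this five-way classification, using illustrative (not exhaustive) keyword lists drawn from the examples above:

```python
# Classify a first response into the five categories described above.
# Keyword lists are illustrative assumptions based on the examples given.
def classify_first_response(text):
    t = text.strip().lower()
    if t == "yes":
        return "correct consent"
    if t == "quit":
        return "correct non-consent"
    if t in {"okay", "ok", "y"}:
        return "incorrect consent"
    if t in {"no", "ouit", "wuit", "not now"}:
        return "incorrect non-consent"
    return "other/consent intention unclear"

print(classify_first_response("OUIT"))  # incorrect non-consent
```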
Result 3: The Participants and Demographic Data
Understanding the demographics of respondents was also a goal of this study. Although the sample frame data provided demographic information for each phone number, the survey asked participants to report three demographic characteristics: their home zip code, age, and race. We found that 43 percent of the sample reported living inside the targeted area, while 45 percent reported living outside of it; an additional 12 percent did not specify their location. In terms of age, the median age of the respondents was 46 years old, with a range of 21 to 72. In terms of race, in responding to a multiple-choice question, 50 percent of respondents self-reported as White, while 31 percent self-reported as Black or African-American; only 3 percent self-reported as Hispanic or Latino, and none of the respondents identified as Asian.
As a secondary point of investigation, this study sought to explore how the participants’ self-reported data aligned with the sample frame data. Forty-nine percent of zip codes reported by the survey participants matched the zip codes indicated in the sample frame data, while only 32 percent of ages reported by the participants matched the age in the frame (within +/- 1 year).
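The match rules themselves are simple. The following sketch assumes cleaned five-digit zip codes and integer ages; the helper names are our own, not part of the study’s analysis code.

```python
# Sketch of the match rules described above (hypothetical helpers).
def zip_matches(reported_zip, frame_zip):
    # Compare five-digit zip codes after trimming whitespace.
    return reported_zip.strip()[:5] == frame_zip.strip()[:5]

def age_matches(reported_age, frame_age, tolerance=1):
    # Ages count as matching within +/- 1 year, per the text.
    return abs(reported_age - frame_age) <= tolerance
```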
Result 4: Response Timing
Another goal of this study was to understand when people would respond. Data on the time of first response were collected for only 96 individuals, that is, those who responded in a way other than “Quit” (including those who consented, as well as those who attempted not to consent but did so incorrectly). Forty-seven percent of recorded responses (45) came in reply to the initial text invitation, while the other 53 percent (51) came following the reminder text. The highest volume of responses came on the same days that the initial and reminder texts were sent; overall, 88 percent of responses were recorded on the same day as the initial or reminder text.
In terms of timing, 47 percent of all responses came within the first hour of the invitation or reminder text: 23 people responded within the first hour of the initial invitation, and 22 people responded within the first hour of the reminder. Cumulatively, 66 percent of all responses came within the first two hours of text receipt, 72 percent within three hours, and 78 percent within four hours. Because the reminder text was not sent until 5:00 p.m., it is unclear whether sending it earlier in the day would have increased the response rate for that day. Table 3 shows the distribution of responses across the week and by hour from the time the text was sent.
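An illustrative way to compute these cumulative shares from response timestamps (the file and column names are assumptions, not the study’s actual paradata format):

```python
import pandas as pd

# Illustrative timing computation; "invite_sent_at" and "responded_at"
# are assumed column names for the send and first-response timestamps.
df = pd.read_csv("first_responses.csv",
                 parse_dates=["invite_sent_at", "responded_at"])
df["hours_to_respond"] = (
    df["responded_at"] - df["invite_sent_at"]
).dt.total_seconds() / 3600

# Cumulative share of responses arriving within 1-4 hours.
for h in (1, 2, 3, 4):
    print(f"Within {h} hour(s): {(df['hours_to_respond'] <= h).mean():.0%}")
```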
Result 5: Drop-off and Question Type
Throughout the survey, individual item response rates ranged from 76 percent to 100 percent. These rates were calculated by dividing the number of people who responded to the question by the number of people who received the question (which was based on responses to previous questions). The survey itself contained questions that fell into three categories; Table 4 shows the average response rates by item category.
The awareness and behavior questions yielded the highest response rates: 94, 95, 93, 100, and 100 percent, respectively. Second, the two informational statements, each followed by the question, “Ready to move on?,” yielded response rates of 76 percent and 89 percent. Finally, response rates to the demographic questions were relatively high: 81 percent for zip code, 97 percent for age, and 89 percent for race.
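The rate calculation itself is straightforward; the counts in this minimal example are hypothetical placeholders, not figures from Table 4.

```python
# Item response rate as defined above: responders divided by receivers.
def item_response_rate(n_responded, n_received):
    return n_responded / n_received

# Hypothetical counts: a question delivered to 50 people, answered by 47.
print(f"{item_response_rate(47, 50):.0%}")  # 94%
```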
Result 6: Non-response Reasons
Another major goal of this study was to understand the reasons that people would not want to respond to the cold text asking them to participate in a text-based survey. Follow-up phone calls were made to a random sample of 460[2] of the 758 non-respondents, and trained phone interviewers made contact with 116 individuals. Of these 116, 60 were willing to share their reason(s) for not responding to the text. Overwhelmingly, the two most common reasons reported for not responding to the text message survey invitation were not understanding the source or reason for the text (30 percent) and being too busy or occupied to respond (26 percent). Table 5 shows the frequency of non-response reasons.[3]
Responses falling into the first category, such as “I had no idea who it was from and how my number was selected,” indicate a level of skepticism, likely stemming from the fact that receiving a text message invitation to participate in a survey via text is a completely unfamiliar experience for recipients. People reported that when they received the text, they were working, driving, eating, or dealing with a medical problem (e.g., in the hospital or on dialysis). Others explained that they simply had too many other things going on and that responding was not a priority. Of interest, in contrast to previous studies that found the cost of text messaging to be a major deterrent to participating in text-based research, only two people in this study (3 percent) reported that they did not have an unlimited texting plan and paid for individual texts. In addition, three recipients reported that they do not text or use text messaging on their phones, and one indicated that it was because he does not know how to read.
Result 7: Role of Local Context
Analyses indicated that survey completion rates were also related to familiarity with the local context (the answer to the first question). A chi-square test (p=0.001) on the tabulation of first-question responses against survey completion status indicated that whether or not an individual completed the survey was significantly related to his or her answer to the first question (i.e., whether or not he/she was aware of the Wissahickon Valley Park). As shown in Table 6, 84 percent of people who were aware of the Wissahickon Park completed the survey, compared to only 43 percent of those who were not aware. Also of interest, one-third of those who were not aware of the Park unsubscribed from the survey at some point, while no one who was aware of the Park did.
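An illustrative version of this test, with hypothetical cell counts chosen only to mirror the reported completion pattern (roughly 84 percent vs. 43 percent), not the study’s actual Table 6 frequencies:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table of awareness (first question) by completion
# status; counts are illustrative assumptions, not the study's data.
table = [
    [42, 8],    # aware of the park:     completed, did not complete
    [9, 12],    # not aware of the park: completed, did not complete
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
```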
Discussion
Given the cost-efficiency of web-based SMS platforms (estimated at one-tenth the cost of an RDD phone study), it is worth continuing to explore this work, despite an initially lower-than-desired response rate. In addition, we found that minorities and older citizens will participate, which helps dispel previous concerns in the literature regarding the coverage bias of text messaging. Data from this study also further the understanding of cell phone user behavior in general. The 51 percent mismatch between the zip codes reported by respondents and the zip codes appended from billing information suggests that mobile users may be moving, or at least living in places different from where their billing address was established. Additionally, mismatches on other demographic information, such as age, reveal differences between ownership and use: parents may pay for the numbers and phones but give the phones to children, and businesses may provide employees with numbers and phones as well.
This study also provided insight into the ways in which recipients interact with text-based surveys and text messages in general. Follow-up responses showed that a large proportion of non-respondents cited receiving the text message at a time when they were busy or occupied as a reason for not responding. At the same time, survey paradata indicated that the majority of responders did so sooner rather than later. Taken together, these findings suggest that text messaging may be seen as demanding immediate attention, or perhaps as only being meaningful in the present. Future work might include reassuring respondents that they can respond whenever they are free, as well as utilizing multiple reminder texts. In addition, an even larger proportion of non-respondents cited not understanding the source of, or reason for, the text. As indicated, the informational text specified “Temple’s Institute for Survey Research,” but those not familiar with Temple may not have known that Temple is a university and that ISR is an academic research organization. Being explicit about the legitimacy of Temple as a university may be important in reassuring skeptics about the origin of the text message.
Finally, findings from this study suggest that, because of the mobility of mobile users, a local context may not serve as an effective access point. Only 43 percent of respondents actually lived in the targeted area, and only 60 percent were aware of the survey topic, indicating that the attempt to use a local topic may have served as a barrier.
Limitations
This study confronted several limitations, many of which are inherent to working with cell phone frames in general. For example, coverage excludes those who do not own a cell phone, and current cell phone holders may not be the same people for whom the demographic information was reported. Using the SMS text feature of cell phones faces even greater coverage issues and additional barriers, such as only being able to reach those who have text-enabled phones and those who are literate. In addition, text messages are currently limited to 160 characters. Users also faced issues with the text survey platform, as indicated by the 15 percent of incorrect first responses, which have the potential to compromise data reliability. Finally, this study was limited by the fact that demographic characteristics were only known for those who completed the survey, precluding the possibility of comparing completers to partial respondents and non-respondents.
Directions for Future Research
This study suggests several areas for further research, informed primarily by the qualitative feedback provided by the non-respondent group. First, because several non-respondents expressed skepticism about the origin of the text, future surveys will include our full name, “Temple University,” in the introductory text. Second, addressing recipient confusion about how their numbers were selected will be important for future work; researchers may consider stating, “your phone number was randomly selected.” Third, non-respondent reports of feeling too busy to respond may be remedied by prefacing the survey with the statement, “please feel free to respond at your leisure.” Some sources have estimated that Americans spend a total of 37 billion hours waiting in line each year (Stone 2012); figuring out a way to harness some of this general “waiting time” may provide fertile ground for text-message surveying. Fourth, the finding that a portion of non-respondents did not respond because they were not familiar with the topic highlights the need for conducting similar research in non-local contexts. We see this as an opportunity to potentially reach a larger and wider range of participants.
Additional suggested future research includes administering text-based surveys with a panel of respondents who understand the platform, experimenting with multiple reminder texts, and following up with drop-offs. Finally, SMS text-based surveying has important practical implications for researchers in war-torn countries and regions of civil unrest. In these regions, sending out interviewers to conduct face-to-face interviews is dangerous, and phone-based interviews can be both expensive and non-confidential if respondents are in the presence of others. At the same time, these regions are also places where both mobile phone ownership and text usage are prolific. ISR is currently conducting pilot work with researchers in conflict zones.
Acknowledgements
Thanks to our partners in the study – Friends of the Wissahickon and Textizen – for their collaboration and support. This study was previously presented at the World Association of Public Opinion Research conference in Doha, Qatar in 2015 and the American Association of Public Opinion Research annual conference in Hollywood, FL in 2015.
Appendix A. Text Survey Questionnaire
Survey Questions (MAX CHARACTERS: 160)
(A) – ALL : Informational message: Hi from Temple’s Institute for Survey Research! We have a few Qs for you, participation is voluntary – text QUIT anytime or contact Nina Hoe at 215.204.4441
… 5 seconds later…
(B) – ALL: (Recruitment Text): Would you answer a few Qs via text about Wissahickon Park? (You’ll be entered to WIN an iPAD mini) Reply YES or QUIT
(C) 1 – (If Yes to B) Text: Are you aware of Wissahickon Valley Park (home of Valley Green)?
- Yes
- No
(D) 2a – (If Yes to C) Text: Do you currently use Wissahickon Valley Park?
- Yes
- No
(E) 2b – (If No to C) Text: The Wissahickon is an 1,800-acre park in NW Philadelphia. It’s part of the City’s 10,500-acre park system, one of the largest urban park systems in the world.
(F) 3 – ALL: Text: What is your home zip code?
(G) 4 – (If Yes to C) Text: In a short phrase, what does the Wissahickon Park mean to you?
(H) 5 – ALL: Text: Do you know about Friends of the Wissahickon (FOW)?
- Yes
- No
(I) 6a – (If Yes to H) Text: Are you a member of Friends of the Wissahickon (FOW)?
- Yes
- No
(J) 6b – (If No to H) Text: FOW (www.fow.org) is a nonprofit founded in 1924 “to preserve the natural beauty & wilderness of the Wissahickon Valley and stimulate public interest.”
(K) 7a – (If Yes to I) Text: Are you aware that FOW raises and spends over $1.5 million per year to improve the park?
- Yes
- No
(L) 7b – (If No to I) Text: FOW raises and spends over $1.5 million per year to improve the park.
(M) – ALL: Text: Now just a few quick questions about you.
(N) 8 – ALL: Text: What is your age?
(O) 9 – ALL: Text: What is your race?
- Asian
- Black or African American
- Hispanic or Latino
- White
- More than 1 of above
- Other
(P) – ALL: Text: Thanks for your time! To get involved with FOW reply w/your EMAIL. For more info about ISR visit www.temple.edu/isr. You’ve been entered to win an iPAD mini!
Appendix B. Non-Response Follow-up Phone Call Protocol
Interviewer note: As an academic research organization, we are exploring the use of automated SMS text messages in social science research. Hearing from the public is important to us, and we are seeing how people respond to being contacted through text messages.
(1) People who DID NOT RESPOND AT ALL (did not touch text)
- Dial Number
No answer ➔
- “Hello, my name is _______ from the Institute for Survey Research at Temple University. This past Saturday we sent you a text message asking whether you would be willing to take a short survey via text message. For our own research purposes, I am calling to learn why you chose not to participate. If you have a minute, we would love to hear your thoughts on the survey. Please call me at 215.204.4441. Thank you in advance for your time.”
Pick up ➔
- “Hello, my name is _______ from the Institute for Survey Research at Temple University. This past Saturday we sent you a text message asking whether you would be willing to take a short survey via text message. For our own research purposes, I am calling to learn why you chose not to participate.”
2. Do you remember receiving this text message?
a) Response: Yes ➔
- For research purposes, would you be willing to tell me why you don’t want to participate? Are there any reasons in particular?
- Response: [OPEN]
- What is your home zip code?
- Response: [ZIP]
- What is your age?
- Response: [AGE]
b) Response: No ➔ Is your phone equipped to receive text messages?
- Response: No ➔ Is that a feature of your phone or a setting you put in place?
  1. Response: Feature of phone ➔ “Okay, thank you for your time. Have a good day.”
  2. Response: A setting I put in place ➔ What prompted you to put this feature in place?
     - Response: [OPEN]
     - “Thank you very much for your time. Your insight is helpful to us as a research institution. Have a great day.”
  3. Response: Not sure ➔ “Thank you very much for your time. Your insight is helpful to us as a research institution. Have a great day.”
- Response: Yes ➔ “Okay, would you mind checking again to see if you received the text? It was sent this past Saturday, November 15th at 10am. Do you see it? I can give you a minute.”
  1. Response: Yes ➔ Would you be willing to take the survey when we hang up?
     - Yes
     - No ➔ For research purposes, would you be willing to tell me why you don’t want to participate? Are there any reasons in particular?
       - Response: [OPEN]
       - What is your home zip code? Response: [ZIP]
       - What is your age? Response: [AGE]
  2. Response: No – don’t see it/can’t find it.
  3. Response: No – don’t want to look for it.
ALL END with: “Thank you very much for your time. Your insight is helpful to us as a research institution. Have a great day!”
[1] A rate center is a geographical area used by a local exchange carrier (LEC) to determine the boundaries for local calling, billing, and assigning phone numbers.
[2] Because of both time and budget constraints, only 460 numbers were dialed.
[3] Because some respondents indicated multiple reasons for not responding to the text message invitation, the total frequency of responses is greater than 61 (and the sum of the percentages is also greater than 100 percent).