Advances in mobile technology offer survey researchers a new set of tools for collecting information on how people spend their time. The growing popularity of social media platforms such as Facebook and Twitter gives the general public an online medium to share their day-in-the-life activities with friends, family or anyone else who may be interested. In this digital age, survey researchers across many fields can leverage this trend to observe their subjects of interest remotely and wirelessly.
The Life360 practice by The Nielsen Company uses a “digital ethnography” approach to measure the attitudes, preferences and behaviors of a target population using mobile phone surveys, photography, Internet-based journals, video cameras and Web surveys. The digital age presents a unique opportunity to conduct a new form of ethnographic fieldwork by leveraging ubiquitous mobile, video and Internet-based technologies in place of the researcher. The goal of Life360 is to capture what is happening in people’s lives, depicting everyday environments, lifestyles and personalities through “in-the-moment” surveys of the population of interest.
Several data collection techniques are used to piece together a full picture of respondents’ day-in-the-life activities during the survey period. The central component is a recurring, timed survey administered via a smartphone. The specially equipped smartphone prompted respondents on an hourly basis to complete a short survey and to capture an image with the built-in camera as a real-time pictorial record of their surroundings and activities. Respondents also completed an Internet-based pre-survey at the start of the field period to provide basic demographic information along with baseline behavioral and attitudinal data, and a post-survey to capture behaviors and attitudes in greater detail and to monitor any change over the survey period. In addition, they were asked to keep an online e-journal, recording their thoughts about their day in an open format, much like a blog or paper journal. Together, these components form a full picture of a respondent’s life during the survey period.
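To make the structure of this protocol concrete, the sketch below models one hypothetical in-the-moment record (a timed prompt, short survey responses and a photo) together with a simple hourly prompt schedule. The field names, the 16-hour waking window and the Python form are illustrative assumptions, not the actual Life360 implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Dict, List, Optional

@dataclass
class MomentRecord:
    """One hypothetical 'in-the-moment' prompt: a short survey plus a photo."""
    prompted_at: datetime                                     # when the hourly reminder fired
    responses: Dict[str, str] = field(default_factory=dict)   # question id -> answer
    photo_path: Optional[str] = None                          # image captured with the built-in camera

def hourly_prompt_times(day_start: datetime, hours_awake: int = 16) -> List[datetime]:
    """Hypothetical daily schedule of hourly prompts within a waking-hours window."""
    return [day_start + timedelta(hours=h) for h in range(hours_awake)]

if __name__ == "__main__":
    start = datetime(2008, 6, 2, 7, 0)   # assumed 7:00 a.m. start of a field day
    schedule = hourly_prompt_times(start)
    first = MomentRecord(prompted_at=schedule[0],
                         responses={"activity": "commuting", "location": "train"},
                         photo_path="photos/0700.jpg")
    print(len(schedule), "prompts scheduled; first record at", first.prompted_at)
```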
The digital ethnography approach using a mobile device is well suited to time use studies, which measure how people choose to spend the 24 hours of each day. Paper questionnaires and telephone interviews have been the most common methodologies for recording respondents’ daily activities in time use surveys. The paper questionnaire often involves a complicated grid format in which respondents indicate when each activity began and ended, write down the activity performed, and so on. This can be a burdensome task, and it also carries a risk of retrospective bias, since respondents may recall their activities erroneously. Telephone surveys likewise rely on retrospective accounts of how respondents spent their day or even their week, which is far less precise than data collected at the moment of the activity. Although few time use surveys have used a portable electronic device or smartphone for data collection, electronic tools offer a clear opportunity to minimize the retrospective bias of paper and telephone methodologies.
To learn more about respondents’ experiences using these devices and participating in a Life360 study, The Nielsen Company conducted cognitive interviews with a random sample of participants from a 2008 Life360 study of digitally savvy consumers. A total of eleven respondents participated in the cognitive interviews, with recruitment procedures designed to ensure representation across demographic characteristics such as gender, age, household income and geographic location (see Table 1). These one-hour cognitive interviews were conducted over the telephone, with reference materials emailed to respondents in advance; respondents received a $25 cash contingent incentive for their participation. The interviews focused on four key areas: smartphone feature utilization, interface design, general usability, and task compliance.
Technical Features of Mobile Device
Respondents were asked to recall their interaction with the smartphone during the survey period, focusing on its technical features from the moment they were prompted to complete a survey. The volume of the survey reminder alarm presented a challenge for respondents who brought the smartphone to their workplace. The alarm function was limited to two settings, high or silent. When the phone was silent, respondents often missed the reminders, especially after hitting snooze, which paused the next reminder for 10 minutes. It was apparent that a vibrate option and volume control would help respondents accommodate whatever surroundings they were in during the survey period, and this feature has been implemented for all subsequent studies. Furthermore, respondents can now initiate surveys based on specific behaviors or situations (e.g., when they are engaging in common activities such as consuming food and beverages or attending sporting events).
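As a rough sketch of the reminder logic described above, the following hypothetical Python fragment models the alert modes (the original high/silent pair plus the later vibrate option) and the 10-minute snooze pause; the names and structure are assumptions for illustration only, not the device’s actual firmware.

```python
from datetime import datetime, timedelta
from enum import Enum

class AlertMode(Enum):
    """Hypothetical alert modes: the original device offered only HIGH or SILENT;
    VIBRATE and adjustable volume were added for subsequent studies."""
    HIGH = "high"
    SILENT = "silent"
    VIBRATE = "vibrate"

SNOOZE_MINUTES = 10   # the pause before the next reminder, as described above

def next_reminder(now: datetime, snoozed: bool) -> datetime:
    """Return when the next survey reminder should fire."""
    return now + timedelta(minutes=SNOOZE_MINUTES) if snoozed else now

if __name__ == "__main__":
    mode = AlertMode.VIBRATE
    fires_at = next_reminder(datetime(2008, 6, 2, 9, 0), snoozed=True)
    print(f"Alert mode: {mode.value}; next reminder at {fires_at.time()}")
```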
Interface Design & Usability Evaluation
Because of the smartphone’s limited screen size, a few survey questions required respondents to scroll to the next page to see the complete list of response options (see Figure 1). Most respondents were aware of the scroll-down arrow at the bottom right of the screen but were “annoyed” at having to scroll to see the full list. After taking a dozen of these surveys, respondents became familiar with the position of the response options they typically selected for each question. The response options were subsequently collapsed into broader categories to minimize the need for scrolling.
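Collapsing a long option list can be thought of as a simple mapping from detailed choices to broader parent categories, as in the hypothetical sketch below; the specific activities and category labels are invented for illustration and are not the Life360 code frame.

```python
# Hypothetical mapping of detailed activity options to broader on-screen categories,
# so that the visible list of radio buttons fits on one screen without scrolling.
BROAD_CATEGORIES = {
    "watching TV": "Media & entertainment",
    "browsing the web": "Media & entertainment",
    "cooking": "Household",
    "doing laundry": "Household",
    "commuting": "Travel",
    "working at a desk": "Work",
}

def on_screen_options() -> list:
    """Return the de-duplicated broader categories shown as response options."""
    return sorted(set(BROAD_CATEGORIES.values()))

if __name__ == "__main__":
    print(on_screen_options())   # four broad options instead of six detailed ones
```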
Given the limited size of the smartphone screen, the radio buttons for the response options offered only a small target area for respondents to register their selections. Even though each smartphone came with a stylus for respondents who preferred not to use their fingers on the touchscreen, a number of respondents still had trouble selecting their response precisely (see Figure 2). A larger tap area, allowing respondents to select a response by tapping its text, could be considered for the touchscreen.
Task Compliance
The two key motivators that kept respondents completing the hourly surveys were the contingent incentive offered and their curiosity about, or interest in, the research project. In addition, because all of these respondents had already established a relationship with Nielsen through another research initiative, they were generally less skeptical about participating and more willing to comply with the task.
Most of these respondents stated that the strongest motivator was the amount of the contingent incentive offered for the task. The incentive plan was based on pay for performance: respondents would earn $100 for completing the mobile diary portion of the survey and could earn a higher amount at the end of the survey period upon completion of all tasks. In fact, when asked for suggestions on other types of incentive payment, over half of these respondents said they would have liked to keep the smartphone, since they had become familiar with its functionality. They also expressed interest in participating for a longer period if they were offered the smartphone with an unlimited data and voice plan.
Ultimately, these respondents also participated and shared such detailed information about their lives over the survey period because they were interested in, or curious about, the research project. One respondent echoed the sentiments of others on their willingness to participate: “At first I was uneasy with all the detailed information, I thought, why do they want so much? But as I went through it, I learned so much about how I spent my time throughout the day. It opened my eyes on how I spent my time, doing so much laundry when I should be spending time with my kids.”
The cognitive interviews revealed a learning curve at the start of the survey period, for example in entering or selecting data, but most respondents adjusted to the device within the first few days. It was also clear that simple improvements to technical features, such as finer volume control for the reminder alarm, could increase task compliance; this type of improvement has already made a significant difference for participation in longer-term panels. Finally, a user-friendly questionnaire design that limits scrolling, whether by collapsing response options into broader categories or by using data entry to collect open-ended responses on the mobile device, can improve overall data quality for the duration of the survey period.
There are limitations to these findings that should be considered in further research. While the qualitative nature of these findings offered unique insight into the interface design and usability of the mobile survey, more specific research on questionnaire format and administration, such as (1) text font, style, size and effects relative to the dimensions of the particular smartphone used for the survey, (2) response-order effects when the same survey is administered several times a day over an extended period, and (3) anticipation bias arising from the frequency of daily surveys, could help gain greater compliance for longer-term panels. Nonetheless, the concept of digital ethnography, leveraging the mobile device as the “participant-observer” along with the capability to customize the hardware and applications to meet the needs of specific research studies, opens the door to capturing a level of detail that is unattainable through traditional survey approaches.