Introduction
To develop the best questionnaires possible, the Census Bureau routinely conducts expert reviews and a variety of qualitative testing methods: exploratory, cognitive, and usability interviews, eye-tracking, and focus groups. Typically, methodologists receive a pre-existing survey instrument, identify potential problems in the response process, and suggest a moderate number of improvements. They then conduct qualitative testing and provide recommendations on potential changes and next steps. Subject matter experts (SMEs) are consulted throughout this process, but the methodologists typically take on most of the evaluation and testing work.
Currently, several of the Census Bureau’s flagship computer-assisted personal interview (CAPI) surveys are undergoing a modernization process that includes developing a web-based instrument for self-administration. This is a significant undertaking, especially since many of these surveys are long, complex, and ask about difficult or sensitive topics. One of the surveys being modernized is the Annual Social and Economic Supplement (ASEC) of the Current Population Survey (CPS) (see https://www.census.gov/programs-surveys/cps/about/modernization/asecmodernization.html). ASEC covers the following topics: health insurance, out-of-pocket medical expenses, earned income, other income sources, and poverty indicators. Redeveloping and thoroughly testing an ASEC web instrument, from exploratory interviews to field testing, will involve four years of effort and over thirty SMEs within the Census Bureau.
The teams of SMEs are the leading authority on the ASEC survey redesign. For the first round of cognitive testing, they were eager to rewrite questions and introduce new question formats to lower respondent burden and leverage the visual medium and technology of the web. With the SMEs leading the question revisions, the usual dynamics of collaboration between them and the methodologist changed. Although in a supporting role, the methodologist was still responsible for incorporating methodological principles and further refining the questions for qualitative testing, especially since question design and testing methodologies were not necessarily part of the SMEs’ backgrounds.
Given the complexity of the instrument, the multiple stages of qualitative testing, the central role of the SME teams, and the extensiveness of the changes to be tested, an enhanced level of documentation and coordination was necessary. The methodologist needed a solution that would not only collect detailed information about each survey question but also document the SMEs’ motivations for changing questions, so that her question revisions aligned with their goals. Therefore, a template was devised to gather and organize essential information about the proposed survey questions and to provide a structured platform for iterative communication between the methodologist and the SMEs.
Template
The proposed template was simple. For each survey question, the SMEs first reported the Research Question: a succinct statement of the main concern(s) with the (newly designed) survey question, especially concerns about whether the question would succeed without an interviewer present. The concerns could pertain to, for example, vocabulary, overall comprehension, the potential need for help text or instructions, or the usability of new question formats. Next, the SMEs provided the complete survey question, including the question stem, response options, and fills. Finally, the template specified the Report back: what was to be reported from the cognitive testing. SMEs were encouraged to ask for situations where the survey question failed, alternate wording offered by respondents, and counts of certain behaviors or responses. This three-part sequence of information was to be applied to every survey question in the revised questionnaire, even if the question was unchanged from the CAPI version or the SMEs had no concerns about respondents’ ability to understand and answer it.
The template was initially presented to the SMEs in a kick-off meeting as a PowerPoint presentation (see Figure 1). The presentation also encouraged SMEs to designate “low priority” survey questions that could be dropped from the test questionnaire, if needed, to save time. These survey questions were set in light grey, rather than black, font.
The SMEs were receptive to the template and grateful to have some structure for their task. The presentation was converted into a guidance document in Word with additional examples and instructions (e.g., using an automatic table of contents, inserting horizontal lines to separate the survey questions). The examples were drawn from another Census Bureau survey so as not to influence the ASEC revisions (see Figure 2). As the SMEs used the guidance to create their own templates in Word, further improvements were suggested (e.g., universes and detailed instructions on question fills were included for each survey question). The template became an essential document, recording all relevant information about the survey questions.
Sometimes the SMEs added information or used different formatting. For example, one team formatted each survey question as a table, inserted skip instructions, and specified the question type (see Figure 3). Their embrace and adaptation of the template was a positive sign.
The content of the template was passed between the SMEs and methodologist several times for each section of the survey to continually refine the questionnaire and goals for testing. As intended, the methodologist used the template to guide her revisions and develop meaningful probes that would give the SMEs the information they needed. The final version was provided to the contractor conducting the qualitative data collection so they too could understand the testing goals of each question and what information was required from the interviews.
Additional Applications of the Template
Beyond the development of the survey questions and testing protocol, the template had a couple of unintended benefits that supported other steps in the cognitive testing process. Because the universes were included and heavily scrutinized by the SMEs, the template was used to provide specifications to the programmers. The template also became the backbone of the cognitive testing report prepared by the contractor. For each question tested, the contractor included the SMEs’ Research Question and the survey question text. When reporting the findings for each question, the contractor used the information from the Report back, or the specific probes (which closely mimicked the Report back), as subheadings. See the appendix for an example.
Conclusions
The template was straightforward for the SMEs to use and motivated them to thoughtfully consider each CAPI question and what was important for the redesign. It provided essential support to the development and testing process and facilitated communication and cooperation across teams. Despite the SMEs’ inexperience with qualitative interviewing and questionnaire design, this trial found the template to be an easy tool for non-methodologists to apply.
Even though the magnitude of the ASEC redesign effort is unique, we would recommend the template for collaborative projects of any size. In our experience, it fostered communication between the sponsor and the methodologist and provided helpful documentation. As methodologists, we also found it advantageous that the template used Word, where it was easy to comment on specific terminology or segments of the questions, track changes, and visualize the instrument.
One lesson learned during this process was that the subtle difference between the Research Question and the Report back was not immediately clear to everyone. An effective clarification was that the Research Question communicates the main concern(s) with the question, while the Report back details indicators of that concern, many of which are translated into probes. Both help the methodologist suggest additional probes and reporting, or improved question wording.
The template has established its place in the ASEC modernization effort’s toolkit and is being implemented for the next round of cognitive testing. The SMEs have noticeably improved both the format of their templates and their questionnaire design skills. The template, used in conjunction with the methodologist’s comments and edits, has imparted some basic questionnaire design principles, making for a more efficient and harmonious redesign process.
Corresponding author contact information
Jennifer Sinibaldi
Appendix
Findings: [NOTE: INCLUDES FICTITIOUS FINDINGS TO DEMONSTRATE CONTENT]
Probe 1: How did you count the number of weeks looking for work or on layoff? Probe if counted partial week(s)
- Only four participants received this question.
- One said “all year” and entered 52.
- One participant said “it was about 2 months” and entered 8 weeks.
- Two participants recalled the exact number of weeks; however, they both rounded up to the nearest week.
Probe 2: Do you know how many weeks are in a year? Or are you a little uncertain?
- The three participants who did not enter “52” knew that there are 52 weeks in a year.
Overall, participants found this question easy to answer and understood that there are 52 weeks in a year. There may be some concern about recall, but participants were able to determine the number of weeks they were looking for work or on layoff in 2023 and rounded to the nearest week when responding.
Recommendations:
No recommendations. The question was easily understood by participants.