Many studies ask respondents for information from their past. Unfortunately, this information can be hard for respondents to remember. Event history calendars (EHCs) aid respondents’ memories by giving visual cues that help them place past events on a timeline.
Most literature on EHCs focuses on interviewer-administered tools. For example, Freedman et al. (1988) describe the EHC as a large grid within a survey tool. One grid dimension lists the components being studied while the other dimension lists the time units. The interviewer fills in the cells as the respondent reports the events.
Information from respondents’ pasts was critical to our recent research project. The goal was to understand the reasons individuals enlist in the Army. Of particular interest were “older” recruits: those who could have enlisted at age 18 but instead enlisted later. In addition to responses to survey questions, we wanted to collect detailed academic, social, and economic information from the time participants left high school to the time they enlisted to understand what happened in these individuals’ lives that prompted them to enlist later. With Army guidelines allowing enlistment up to age 42, this could be 25 years of history for some participants. Cost and time constraints prevented us from using a one-on-one, interviewer-administered model for collecting EHC data, so we designed a version that could be self-administered in group sessions.
Our modified EHC looked much like a traditional EHC: a large grid with columns labeled by year and rows for each question (see Figure 1). Some items included check boxes, and other items asked participants to write letters or numbers in the grid to document their response.
Although the modified EHC looked like a traditional EHC, there were six distinct differences in the instrument and in the administration.
- Interviewer Cues Became Respondent Cues. Traditional EHCs include cues interviewers use as reminders about questions to ask respondents. We modified these interviewer cues into respondent cues by including the full question wording and all response options when possible. We used these cues for questions that would not need further clarification from an interviewer, such as basic demographics, family, and educational history.
- User-Friendly Layout. Traditional EHCs can be cluttered with ancillary information, which interviewers are trained to use but would be confusing to a respondent. On the modified EHC, we displayed only the necessary information. We laid out the EHC in a clear table format with bold headings. Alternating rows were shaded gray to help keep respondents aligned from question to response. Large black arrows guided the respondent from the question stem to the response location. We also added “hints” to prompt the respondent the way an interviewer might. (“Hint: You can have more than one letter per box.”)
- Item-by-Item Explanation. In traditional EHC administration, the interviewer explains each item to the respondent. In our group administrations, a lead survey administrator (the proctor) reviewed each item with the large group and then instructed the respondents on how to fill out the EHC.
- Interviewer-Tailoring Became Self-Tailoring. Traditionally, the interviewer tailors the EHC and asks only the questions that apply to the respondent. For our modified EHC, we used generic markers to make the EHC easier for respondents to tailor for themselves. For example, we asked respondents to provide information from the time they left high school until the time they signed their current enlistment contract. To avoid reporting outside of this assigned window, we instructed all respondents to cross off all the years they were in high school, a task each respondent could carry out individually.
- Visual Aids. We developed a set of detailed visual aids for our group-administered EHC. Each slide contained an example of the current item the proctor was working on with the group. The slide showed the respondent where to fill in their information and how to document their responses.
- Individual Assistance. Traditional EHC methodology allows the interviewer to provide encouragement to the respondent when needed. For our group administrations, we trained two survey administration assistants. They helped respondents who appeared confused or were not following along.
We found the modified EHC to be a successful tool on a number of levels. First, we had a low refusal rate. Less than four percent of those invited refused to participate in any part of our project, and less than three percent of those who completed any part of the survey did not complete the EHC. These rates show that although the self-administered EHC is complicated, respondents are willing to complete it.
Second, we found that even the oldest respondents filled out the EHC completely (see Figure 2). The respondents were able to follow the visual cues on the survey instrument and on the projected slides.
Finally, we purposely asked some questions twice: once on the survey portion and again on the EHC. We retained these repeated questions to aid respondents’ memory and facilitate EHC completion; they also allowed us, at least for the questions we repeated, to validate the accuracy of the EHC data.
We examined a random sample of surveys and EHCs for individual respondents to validate the accuracy of two demographic items: marital status and last year attended high school. Respondents’ answers to marital status matched 99.5 percent of the time. Answers to last year attended high school matched 94.0 percent of the time; in most of the remaining six percent of cases, reports were off by a single year. This higher mismatch rate may reflect initial confusion: last year attended high school was the first item on the self-administered EHC, and this survey methodology was novel to respondents.
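The validation described above amounts to comparing each respondent’s paired answers and tallying agreement. A minimal sketch of that tally, using hypothetical function names and made-up years rather than the study’s actual data or code, might look like:

```python
def match_rate(pairs):
    """Fraction of (survey, EHC) response pairs that agree exactly."""
    matches = sum(1 for survey, ehc in pairs if survey == ehc)
    return matches / len(pairs)

def off_by_one_rate(pairs):
    """Among mismatched year pairs, the fraction that differ by exactly one year."""
    mismatches = [(s, e) for s, e in pairs if s != e]
    if not mismatches:
        return 0.0
    return sum(1 for s, e in mismatches if abs(s - e) == 1) / len(mismatches)

# Toy example: four respondents' "last year attended high school"
# as reported on the survey vs. on the EHC (fabricated values).
pairs = [(2001, 2001), (1999, 2000), (2005, 2005), (1997, 1997)]
print(match_rate(pairs))       # 0.75
print(off_by_one_rate(pairs))  # 1.0
```

The same comparison applies to categorical items such as marital status, where only `match_rate` is meaningful.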
From our experience developing and fielding a self-administered EHC, five lessons emerged.
- The self-administered EHC format must be user friendly. Questions and response choices should be presented in full whenever possible. Clear layout and directional cues should be incorporated to facilitate respondent completion. Write-in locations should be easy to find and provide ample space for responses.
- Visual aids provide critical assistance to respondents, especially for confusing or complicated question sets. Visual aids are crucial when a question is not printed in full on the EHC. The slides should show a plausible example that demonstrates how questions should be answered and how responses should be documented.
- A well-trained proctor should lead the respondents through the EHC administration. The proctor must be attuned to the group dynamics and be able to adjust the pace as needed.
- Trained survey administration assistants should be positioned among the respondents to address individual questions. This allows the proctor to keep the group working on the EHC while individual questions are answered. The number of assistants will vary with the EHC complexity and the group size. We were able to administer the modified EHC in groups of up to 200 with one lead proctor and two assistants.
- Ease into the EHC. We found that six percent of respondents incorrectly answered the first question on the EHC; perhaps a sample question, additional instructions, or more respondent training prior to the first item would have improved the dataset.
It is not clear that our modified EHC could be totally self-administered. A fully self-administered EHC might be possible for some topic areas; for example, our items on basic demographics, family events, and education history likely could have been completed without proctor assistance. Additional research efforts that incorporate a larger number of comparable survey questions and EHC questions will help us better understand the accuracy of self-administered EHCs as data collection tools.
The author is grateful to the RAND Survey Research Group for providing support for this article; Bernard Rostker, Jacob Klerman, and Jennifer Hawes-Dawson for their encouragement and mentorship; Jennifer Pevar and Erica Czaja for helpful feedback and dedicated research assistance; LTC Sonya J. Cable for coordinating the pilot testing and live data collection; and the soldiers from Fort Jackson, South Carolina, who helped demonstrate that these data can be collected in a group session.