Behavior Coding Using Computer Assisted Audio Recording: Findings from a Pilot Test

Joanne Pascale


Behavior coding, a pretesting method that involves the systematic application
of standardized codes to behaviors that interviewers and respondents display
during the question/response process, has recently been enhanced by the
Computer Audio Recorded Interviewing (CARI) system. Traditionally,
interviewers used cassette tape recorders to capture a relatively small number
of interviews. CARI is built into the CATI/CAPI instrument and allows for
digital capture of all interviews (contingent on consent from the respondent).
The Census Bureau piloted the use of CARI for evaluation of the 2010 American
Community Survey (ACS) Content Test. This test was carried out to evaluate
changes to existing questions that were designed to enhance data quality, and
to evaluate alternative ways of asking about new topic areas. Interviews (or
designated subsets of interviews) were recorded and coded in a total of 1,427
households. The coded interviews were conducted in both English and Spanish
(1,092 and 335 cases, respectively) and they were recorded in CATI and CAPI
modes (726 and 701 cases, respectively). This paper will provide highlights of the
findings from behavior coding on a number of topic areas. It will also highlight
the unique enhancements offered by CARI, such as: (1) dramatically increasing
the volume and consequent range and diversity of recordings for more targeted
analysis; (2) allowing researchers to listen to interviews during data collection
in order to tailor behavior codes for the project; (3) allowing researchers to
monitor the behavior coding operation in real time for quality assurance;
(4) allowing the coder to see the actual screen display as the interviewer saw
it when asking the questions; (5) allowing the coder to see the data keyed in
to the instrument, enabling the identification and tabulation of keying errors;
and (6) allowing behavior coding of both CATI and CAPI interviews, thus
reducing the differential effects of a centralized data collection facility versus
a dispersed field staff not accustomed to regular monitoring and coaching.

