The Prospects of Computer Assisted Self-Interviewing in Group Settings in Developing Countries: A Comparison Between a Tablet- and Paper-Based Survey Among Secondary School Teachers in Côte D’Ivoire and Kenya

Line Kuppens
Institute of Development Policy and Management, University of Antwerp
Centre for Research on Peace and Development, KU Leuven

Arnim Langer
Centre for Research on Peace and Development, KU Leuven


This paper assesses the advantages of switching from paper-and-pencil self-administered questionnaires (PAP-SA) to computer-assisted self-interviews (CASI) in the context of group-administered surveys in developing countries. To do so, we compare two surveys that the authors conducted among secondary school teachers in Côte d’Ivoire and Kenya, respectively. In addition to the advantages that have been associated with the use of interviewer-assisted surveys in Western contexts, we found that the use of tablets was received enthusiastically by our Kenyan respondents, even by those who had no prior experience using tablets. Nor were the quality and duration of our Kenyan survey compromised by respondents’ lack of prior exposure to electronic devices such as tablets.


About 20 years ago, scholars foresaw a rapid shift from traditional paper-and-pencil questionnaires to computer-assisted surveys, conducted with the help of mobile phones, Personal Digital Assistants (PDAs), computers or tablets. This shift was argued to be inevitable given the important advantages that computer-assisted surveys have over paper-based questionnaires, including a reduction of data collection and entry errors, more flexible questionnaire routing, and the possibility of including a range of consistency checks (Byass et al. 2008; Fletcher et al. 2003).

While computer-assisted surveys have become the norm in Western or developed settings, paper-and-pencil questionnaires remain quite common in developing countries (Caeyers et al. 2012). The shift to computer-assisted surveys in developing countries has been much slower because of climatological and logistical challenges, as well as respondents’ potential lack of exposure to technology (Hewett et al. 2004). However, in the field of health-related surveys, computer-assisted personal interviewing (CAPI) has recently gained much ground in developing countries (e.g., Engelbrecht et al. 2016; Simbayi et al. 2007). So far, only a relatively small number of studies have examined the impact of this switch in these specific settings. Most of these studies have focused on CAPI surveys, while only a few have analyzed the impact of computer-assisted self-interviewing (CASI) (for exceptions, see Hewett et al. 2004; Jaspan et al. 2007; van de Wijgert et al. 2000).

The current paper aims to contribute to the literature on CASI surveying in developing countries by comparing a paper-and-pencil self-administered (PAP-SA) survey concerning teachers’ perceptions and attitudes toward dealing with their country’s violent past in the classroom with a tablet-based version of this survey. While the PAP-SA survey was conducted among secondary school teachers in Côte d’Ivoire, the CASI survey was conducted among secondary school teachers in Kenya. Innovatively, the surveys were administered in a group setting: teachers gathered in the staff room to respond (individually) to the survey. Group-administered surveys are a useful strategy for collecting data within institutional contexts, such as schools, because they can achieve high response rates, while generally requiring only a small number of survey administrators and supervisors (Dörnyei and Taguchi 2009).

In the next section, we briefly review the advantages of computer-assisted surveys. After providing background information on both our surveys, the paper continues by comparing the paper-based to the tablet-based survey. We then conclude.

Advantages of Computer-Assisted Surveys

The advantages of computer-assisted surveys appear to be the same in developing and developed countries and include higher item response rates and overall more complete datasets, better adherence to sampling protocols, automatic saving of location and time, and the reduction or even elimination of data recording and entry errors (e.g., Caviglia-Harris et al. 2012; Gwaltney et al. 2008; Marcano Belisario et al. 2015; Zhang et al. 2012). Moreover, survey software can detect out-of-range and ambiguous responses and offers possibilities to include complex skip patterns as well as response requests to avoid missing data (Caeyers et al. 2012). The immediate uploading of data also improves data security and allows for better monitoring and supervision of interviewers (Leisher 2014). Other advantages include the increased reporting of sensitive topics (e.g., Gnambs and Kaspar 2015; Hewett et al. 2004; Jaspan et al. 2007). Although it is expensive to acquire the necessary electronic devices for a CAPI survey, these expenses tend to be recovered by the savings that are usually made in terms of data entry and cleaning (King et al. 2013; Zhang et al. 2012). Similarly, a CASI survey tends to be more cost-effective than a PAP-SA survey in cases where the electronic devices are used for multiple surveys or for large-scale surveys (Brown et al. 2008). Research on the impact of computer-assisted surveys on interview duration shows a mixed picture. While a small number of studies have reported significantly shorter interviews (see e.g., Caeyers et al. 2012; Leisher 2014), other studies report no significant change in interview duration (King et al. 2013; Zhang et al. 2012). The effects of CASI surveying on interview duration have so far not been systematically analyzed.
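The out-of-range detection, skip patterns and response requests mentioned above amount to simple validation logic. The following sketch illustrates the idea; the questions, ranges and skip rule are invented for illustration and are not taken from any particular survey package:

```python
# Minimal sketch of the entry checks that survey software automates.
# All question names and ranges here are hypothetical examples.

def validate(answers):
    """Return the problems a CAPI/CASI tool would flag on entry."""
    problems = []
    age = answers.get("age")
    if age is None:
        problems.append("age: missing response")          # response request
    elif not 18 <= age <= 99:
        problems.append("age: out of range")              # range check
    # Skip pattern: years_teaching is only asked when is_teacher == "yes".
    if answers.get("is_teacher") == "yes" and answers.get("years_teaching") is None:
        problems.append("years_teaching: missing response")
    if answers.get("is_teacher") == "no" and answers.get("years_teaching") is not None:
        problems.append("years_teaching: should have been skipped")
    return problems
```

On paper, such inconsistencies only surface at the data-entry stage; here they are flagged before the respondent can proceed.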

Research has shown that survey respondents in developing countries tend to react favorably to computer-assisted surveys, including CASI surveying. Not only do respondents in developing countries consider a CASI survey to be more user-friendly, but they also prefer it to a PAP-SA survey (Hewett et al. 2004; Jaspan et al. 2007; van de Wijgert et al. 2000). Unsurprisingly, the correct use of electronic devices is strongly correlated with the level of education: i.e., less educated respondents make more invalid entries than highly educated respondents (Hewett et al. 2004; van de Wijgert et al. 2000). Moreover, a CASI survey may at times draw suspicion from people unfamiliar with this type of data collection (Cheng et al. 2011). In rural Kenya, for example, survey researchers using laptops were accused of espionage (Mensch et al. 2003).

The Current Study

The current paper is based on two group-administered surveys, which the authors conducted among secondary school teachers in Kenya and Côte d’Ivoire. The objective of the surveys was to determine teachers’ perceptions of their countries’ violent past as well as their attitudes and practices toward dealing with the past in their classrooms. The Ivorian survey was paper-based and was conducted in Abidjan from February to April 2015. The Kenyan survey took place in Nairobi from May to June 2016 and was conducted on tablets (Samsung S5). We used the well-known survey application Qualtrics (Qualtrics LLC, Provo, UT), which allowed for the offline collection of data. While the surveys were largely similar, where necessary, the questionnaires were amended to reflect country-specific circumstances.

The surveys clustered teachers within schools. The sampling frame was based on a list of official secondary schools in Abidjan (429) and Nairobi (258). After stratification by municipality, we sorted the list by population size of the municipality and by the number of pupils and teachers per school. From the sampling frame, we systematically selected 80 schools in Abidjan and 64 in Nairobi. The research team visited each selected school prior to the survey in order to request permission to conduct a survey among the school’s teachers; permission was obtained from all selected schools. A second visit to each school was subsequently scheduled in order to conduct the actual survey. At this stage, three schools in Abidjan dropped out. Within each school, all teachers were invited to participate (2,412 in Abidjan; 1,344 in Nairobi), but only the teachers who were present on the agreed day actually participated in our survey. If fewer than one third (Abidjan) or fewer than half (Nairobi) of the teachers were present, we organized a second survey day. In total, 984 Ivorian and 925 Kenyan teachers participated in our surveys (i.e., response rates of 0.40 and 0.69, respectively).[1] We agreed that teachers would gather in the staff room during their break, where each of them was given a paper questionnaire in the case of Côte d’Ivoire and a tablet in the case of Kenya. Only a handful of teachers refused to participate because of a heavy workload, and hence our cooperation rates were nearly 100% in both cases. One of the researchers was always present to provide assistance if necessary.
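The systematic selection from the sorted frame can be sketched in a few lines. This is a generic illustration of systematic sampling with a random start, not the authors’ actual selection script, and the school identifiers are placeholders:

```python
import random

def systematic_sample(frame, n):
    """Select n units from an ordered sampling frame at a fixed
    interval k = N/n, starting from a random point in [0, k)."""
    k = len(frame) / n
    start = random.uniform(0, k)
    return [frame[int(start + i * k)] for i in range(n)]

# E.g., 80 of the 429 schools on the Abidjan list (names are placeholders).
abidjan_frame = [f"school_{i:03d}" for i in range(429)]
sample = systematic_sample(abidjan_frame, 80)
```

Because the frame is sorted by municipality size and school size before selection, the fixed interval spreads the sample across those strata implicitly.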

The paper-based survey presented several questions per page (see questionnaire in Appendix). To enhance readability, the questions were put in bold, and response options were printed alternately on a blank/gray background. To respond, teachers had to circle the number corresponding to their response. Writing was limited to one open question and to providing a written response when respondents selected the “Other” option.

The tablet survey showed one question at a time, and only important aspects of a question were put in bold (see screenshot in Appendix). Teachers could select their response by touching the screen or by typing in their response with the help of a pop-up keyboard. Both automatic routing and constructive error messages were integrated into the survey design. The survey also automatically recorded response times.

Because it allowed more complex designs, the tablet survey also included two vignette experiments and one list experiment. Each vignette described a hypothetical situation at school involving pupils and/or teachers (e.g., a fight between pupils or the marking of an essay). The names of the pupils mentioned in the vignettes were intended to reflect specific ethnic (4 variations) or religious (2 variations) group identities. In order to assess to what extent teachers discriminate between pupils from different backgrounds, the vignette versions were randomly allocated to teachers. Our list experiment aimed to determine to what extent teachers give preferential treatment to pupils from their own ethnic group. Although the same experiments could hypothetically have been conducted in our survey in Côte d’Ivoire, preparing and properly allocating the questionnaires would have been a much more cumbersome procedure, particularly given that the surveys were printed and assembled locally.
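The random allocation of vignette versions is the step that survey software makes trivial and that paper makes cumbersome. The sketch below illustrates the allocation logic; the version labels and the seeding are illustrative assumptions, not the study’s actual setup:

```python
import random

# Hypothetical sketch of allocating vignette versions to respondents,
# as a survey platform's built-in randomizer would do.
ETHNIC_VERSIONS = ["E1", "E2", "E3", "E4"]   # 4 ethnic-name variations
RELIGIOUS_VERSIONS = ["R1", "R2"]            # 2 religious-name variations

def allocate(respondent_ids, seed=0):
    """Independently draw one version of each vignette per respondent."""
    rng = random.Random(seed)  # seeded so the allocation is reproducible
    return {rid: (rng.choice(ETHNIC_VERSIONS), rng.choice(RELIGIOUS_VERSIONS))
            for rid in respondent_ids}

assignments = allocate(range(925))  # one pair of versions per respondent
```

On paper, the same design would require printing, shuffling and tracking several questionnaire variants by hand, which is exactly the burden noted above.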

Comparing a Paper- to a Tablet-Based Self-Interview in a Group Setting

In this section, we compare both surveys. We do not apply statistical tests, given the differences between the two sample populations and the contextualization of the questionnaires.

Survey Practice

In order to avoid discussions among teachers and to increase participation, teachers had to respond to the survey on the spot. Teachers were more inclined to do so with the tablet survey than with the paper version; in the latter case, many teachers asked whether they could take the questionnaire home and complete it at a later stage. Teachers also appeared to enjoy the tablet survey considerably more. Illustratively, one teacher spontaneously remarked during a follow-up interview: “The survey was good, it felt like my colleagues were happy. They were saying that it’s actually something that they have never seen (i.e., tablet-based surveys)” (Female teacher, 33 years).[2] This enthusiasm may also partly explain why a relatively higher number of teachers in Kenya (795) than in Côte d’Ivoire (566) were willing to participate in our follow-up research. While teachers who were not familiar with tablets initially required a bit of time to get used to the devices, they easily caught up after some help from the research team. Because the data were uploaded daily to the cloud, data security was also much better in the case of Kenya.

We rented 30 tablets from a local research institute at a rate of 300 KES ($2.95) a day.[3] The tablets were rented for 5 weeks, yielding a total cost of $3,657. The license to use Qualtrics was university-subsidized. The costs in Côte d’Ivoire were much lower: $257 was spent on printing the questionnaires, while sending the completed questionnaires back to our home university in Belgium cost $254. While renting the tablets was relatively expensive, their use significantly reduced the time and effort spent on data entry and cleaning. Data entry and checking in the case of Côte d’Ivoire took a total of about 8 weeks (shared among three people in order to save time), whereas in the case of Kenya the data were uploaded daily and cleaning took only two days. The tablet survey did, however, require one week to program and test the questionnaire and to prepare the tablets.[4]

Data Quality

Item nonresponse was low in the tablet-based survey because of the integration of automatic routing and constructive error messages. Most questions even included a forced-response notification (a neutral response option was provided).[5] Item nonresponse was highest (79 respondents; 8.5%) on the open-ended question. By comparison, 177 Ivorian teachers (18%) did not respond to that question. Overall, the Ivorian data were less complete: only 18.4% of teachers responded to all items.[6] Most teachers (58.9%) had no more than 5 missing values, and 70.8% had no more than 10. Item nonresponse was largely due to wrong entries, such as responding on the wrong line or circling several responses instead of one. Some teachers did, however, skip entire pages.
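The per-respondent completeness figures above come down to a simple tally over the response matrix. The toy sketch below (invented data, with None marking a missing item) shows the computation:

```python
# Toy data: one list per teacher, one entry per item; None = nonresponse.
responses = [
    [1, 1, 1, 1],
    [2, None, 2, 2],
    [None, None, 3, 3],
    [4, 4, 4, None],
    [5, 5, 5, 5],
]

# Count missing items per teacher, then the shares reported in the text.
missing = [sum(v is None for v in row) for row in responses]
share_complete = sum(m == 0 for m in missing) / len(responses)
share_at_most_1 = sum(m <= 1 for m in missing) / len(responses)
```

With the real data, the same tally yields the 18.4%, 58.9% and 70.8% figures for the Ivorian survey.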

Another major advantage of the tablet survey was the improved readability of the answers provided to the open questions. This is especially relevant for follow-up research: 566 Ivorian teachers were willing to participate in a follow-up study and hence provided their e-mail addresses. Unfortunately, 120 of these addresses were unusable because of illegible handwriting.

Interestingly, CASI also prevents respondents from changing their responses a posteriori. This is particularly useful when the survey includes questions or experiments that are susceptible to social desirability bias.

Research Opportunity

The direct uploading of data collected via CASI provides immediate access to the data. This in turn allows the researcher to conduct exploratory analyses on the spot. This is particularly interesting for multi- or mixed-method fieldwork. We used such preliminary results, for instance, in our follow-up interviews in order to gain more understanding of the observed response patterns.

Discussion and Conclusion

While computer-assisted surveys are standard practice in developed contexts, they have only recently been gaining ground in developing countries. This delayed shift is largely due to researchers’ reluctance to use technological devices given the particular challenges in these settings. Yet, as in developed countries, computer-assisted surveys also have important advantages over paper-based surveys in developing countries. The current paper contributes to this literature by comparing a paper- to a tablet-based survey conducted in Abidjan, Côte d’Ivoire, and Nairobi, Kenya, respectively. The paper differentiates itself from earlier studies through the way the data were collected (i.e., instead of administering the survey individually, we conducted it in a group setting) and through the topic under investigation (i.e., instead of a health-related topic, we offer an example from the social sciences).

Although the findings of this paper are limited due to its nonexperimental nature and potential differences between the two teacher populations, the results are promising and in accordance with previous studies. The design of the tablet survey basically eliminated item nonresponse, while the typing of responses to open questions also eliminated issues of poor handwriting. This is particularly important when one is collecting e-mail addresses for follow-up communication and research. Further, the automatic routing prevented teachers from altering responses a posteriori, which is important when a survey includes questions that are likely to be subject to social desirability effects. The tablet survey also proved more practical. It significantly reduced the effort of data entry and cleaning, and teachers appeared to be more enthusiastic about participating. Moreover, teachers who were not familiar with a tablet mastered it very quickly. Finally, data security was also improved by the daily uploading of data to the cloud, and the immediate availability of the collected data allowed for preliminary analyses, which in turn could inform follow-up interviews.

While there are certain impracticalities associated with using tablets, including the charging of batteries or the risk of theft, the advantages clearly seem to outweigh these relatively minor inconveniences. Although this study was concerned with a group of relatively highly educated respondents, exposure to technology is rapidly increasing in the developing world, which means that the prospects for digital self-administered surveys look promising.


The authors would like to thank the Belgian Development Cooperation, and in particular VLIR-UOS, for their support for this research (grant reference 2014-001-147). VLIR-UOS supports partnerships between universities and university colleges in Flanders (Belgium) and the South looking for innovative responses to global and local challenges.


References

Brown, J.L., P.A. Vanable and M.D. Eriksen. 2008. Computer-assisted self-interviews: a cost-effectiveness analysis. Behavior Research Methods 40(1): 1–7.
Byass, P., S. Hounton, M. Ouédraogo, H. Some, I. Diallo, E. Fottrell, A. Emmelin and N. Meda. 2008. Direct data capture using hand-held computers in rural Burkina Faso: experiences, benefits and lessons learnt. Tropical Medicine and International Health 13(1): 25–30.
Caeyers, B., N. Chalmers and J. De Weerdt. 2012. Improving consumption measurement and other survey data through CAPI: evidence from a randomized experiment. Journal of Development Economics 98(1): 19–33.
Caviglia-Harris, J., S. Hall, K. Mullan, C. Macintyre, S.C. Bauch, D. Harris, E. Sills, D. Roberts, M. Toomey and H. Cha. 2012. Improving household surveys through computer-assisted data collection: use of touch-screen laptops in challenging environments. Field Methods 24(1): 74–94.
Cheng, K.G., F. Ernesto, R.E. Ovalle-Bahamón and K.N. Truong. 2011. Barriers to acceptance of personal digital assistants for HIV/AIDS data collection in Angola. International Journal of Medical Informatics 80(8): 579–585.
Dörnyei, Z. and T. Taguchi. 2009. Questionnaires in second language research: construction, administration, and processing. Routledge, New York.
Engelbrecht, M., A.J. van Rensburg, G. Kigozi and H.C.J. van Rensburg. 2016. Factors associated with good TB infection control practices among primary healthcare workers in the Free State province, South Africa. BMC Infectious Diseases 16(1): 633. DOI: 10.1186/s12879-016-1984-2.
Fletcher, L.A., D.J. Erickson, T.L. Toomey and A.C. Wagenaar. 2003. Handheld computers: a feasible alternative to paper forms for field data collection. Evaluation Review 27(2): 165–178.
Gnambs, T. and K. Kaspar. 2015. Disclosure of sensitive behaviors across self-administered survey modes: a meta-analysis. Behavior Research Methods 47(4): 1237–1259. DOI: 10.3758/s13428-014-0533-4.
Gwaltney, C.J., A.L. Shields and S. Shiffman. 2008. Equivalence of electronic and paper-and-pencil administration of patient-reported outcome measures: a meta-analytic review. Value in Health 11(2): 322–333.
Hewett, P.C., A.S. Erulkar and B.S. Mensch. 2004. The feasibility of computer-assisted survey interviewing in Africa: experience from two rural districts in Kenya. Social Science Computer Review 22(3): 319–334. DOI: 10.1177/0894439304263114.
Jaspan, H.B., A.J. Flisher, L. Myer, C. Mathews, C. Seebregts, J.R. Berwick, R. Wood and L.-G. Bekker. 2007. Brief report: methods for collecting sexual behavior information from South African adolescents – a comparison of paper versus personal digital assistant questionnaires. Journal of Adolescence 30(2): 353–359.
King, J.D., J. Buolamwini, E.A. Cromwell, A. Panfel, T. Teferi, M. Zerihun, B. Melak, J. Watson, Z. Tadesse, D. Vienneau, J. Ngondi, J. Utzinger, P. Odermatt and P.M. Emerson. 2013. A novel electronic data collection system for large-scale surveys of neglected tropical diseases. PLoS ONE 8(9): 1–11. DOI: 10.1371/journal.pone.0074570.
Leisher, C. 2014. A comparison of tablet-based and paper-based survey data collection in conservation projects. Social Sciences 3: 264–271. DOI: 10.3390/socsci3020264.
Marcano Belisario, J.S., J. Jamsek, K. Huckvale, J. O’Donoghue, C.P. Morrison and J. Car. 2015. Comparison of self-administered survey questionnaire responses collected using mobile apps versus other methods. Cochrane Database of Systematic Reviews (7): 1–113. DOI: 10.1002/14651858.MR000042.pub2.
Mensch, B.S., P.C. Hewett and A.S. Erulkar. 2003. The reporting of sensitive behavior by adolescents: a methodological experiment in Kenya. Demography 40(2): 247–268.
Simbayi, L.C., S. Kalichman, A. Strebel, A. Cloete, N. Henda and A. Mqeketo. 2007. Internalized stigma, discrimination, and depression among men and women living with HIV/AIDS in Cape Town, South Africa. Social Science & Medicine 64(9): 1823–1831.
Van de Wijgert, J., N. Padian, S. Shiboski and C. Turner. 2000. Is audio computer-assisted self-interviewing a feasible method of surveying in Zimbabwe? International Journal of Epidemiology 29(5): 885–890.
Zhang, S., Q. Wu, M.H.M.M.T. van Velthoven, L. Chen, J. Car, I. Rudan, Y. Zhang, Y. Li and R.W. Scherpbier. 2012. Smartphone versus pen-and-paper data collection of infant feeding practices in rural China. Journal of Medical Internet Research 14(5): 1–12. DOI: 10.2196/jmir.2183.


Extract from the Ivorian questionnaire.

Dans quelle mesure est-ce que vous êtes d’accord avec les affirmations suivantes ? [To what extent do you agree with the following statements?]
Veuillez entourer le numéro correspondant à votre réponse : [Please circle the number corresponding to your answer:]

Response scale: 1 = Tout à fait en désaccord [Strongly disagree]; 2 = En désaccord [Disagree]; 3 = Ni d’accord, ni en désaccord [Neither agree nor disagree]; 4 = D’accord [Agree]; 5 = Tout à fait d’accord [Strongly agree].

L’école est l’endroit idéal pour apprendre la paix et la tolérance. [School is the ideal place to learn peace and tolerance.] (1 2 3 4 5)
L’école joue un rôle important dans le processus de la réconciliation nationale. [School plays an important role in the process of national reconciliation.] (1 2 3 4 5)
L’école peut changer le comportement des élèves positivement. [School can change pupils’ behavior in a positive way.] (1 2 3 4 5)
La diversité à l’école contribue à démentir les préjugés. [Diversity at school helps to dispel prejudice.] (1 2 3 4 5)
Les amis et les parents influencent plus le comportement d’un élève que l’école. [Friends and parents influence a pupil’s behavior more than school does.] (1 2 3 4 5)
La rencontre de différents groupes ethniques et religieux à l’école entraîne des tensions. [The encounter of different ethnic and religious groups at school leads to tensions.] (1 2 3 4 5)

Extract from the Kenyan questionnaire

[Screenshot of a single-question screen from the Kenyan tablet questionnaire]

[1] Calculated according to Response Rate 1 of the American Association for Public Opinion Research (AAPOR) guidelines.
[2] The follow-up interviews (18 in total) focused on content rather than on survey methodology.
[3] Based on the historic currency conversion rate.
[4] Because of the substantial differences in price levels between Abidjan and Nairobi, we do not compare the overall expenses of both surveys.
[5] Requesting a response is, however, a better way to prevent teachers from proceeding to the next question too quickly by mistakenly pressing the ‘next’ button.
[6] Excluding the specification of the response option ‘Other’ as well as the open-ended question.

The Survey Practice content may not be distributed, used, adapted, reproduced, translated or copied for any commercial purpose in any form without prior permission of the publisher. Any use of this e-journal in whole or in part, must include the customary bibliographic citation and its URL.