Survey data collection explained

With the application of probability sampling in the 1930s, surveys became a standard tool for empirical research in the social sciences, marketing, and official statistics.[1] Survey data collection methods are the various ways in which information can be gathered systematically from a sample of individuals for a statistical survey. The first major shift was from traditional paper-and-pencil interviewing (PAPI) to computer-assisted interviewing (CAI). Today, face-to-face surveys (CAPI), telephone surveys (CATI), and computer-assisted self-administered surveys (CASI, CSAQ) are increasingly being replaced by web surveys.[2] In addition, remote interviewers may keep respondents engaged while reducing costs compared to in-person interviewers.[3]

Modes of data collection

The choice among administration modes is influenced by several factors, including (1) costs, (2) coverage of the target population (including group-specific preferences for certain modes[4]), (3) flexibility in asking questions, (4) respondents' willingness to participate, and (5) response accuracy. Different modes also create mode effects that change how respondents answer the same questions. The most common modes of administration are discussed under the following headings.[5]

Mobile surveys

Mobile data collection, or mobile surveys, is an increasingly popular method of data collection; over 50% of surveys today are opened on mobile devices.[6] The survey, form, app, or collection tool runs on a mobile device such as a smartphone or tablet. These devices offer innovative ways to gather data and eliminate laborious manual data entry (typing paper-form data into a computer), which delays analysis and understanding. By eliminating paper, mobile data collection can also dramatically reduce costs: one World Bank study in Guatemala found a 71% cost decrease using mobile data collection compared with the previous paper-based approach.[7]

Beyond high mobile-phone penetration,[8][9] further advantages are quicker response times and the ability to reach previously hard-to-reach target groups. Mobile technology thus allows marketers, researchers, and employers to create real and meaningful engagement in environments beyond the traditional desktop setting.[10][11] However, even when answering web surveys on mobile devices, most respondents still respond from home.[12][13]

SMS/IM surveys

SMS surveys can reach any handset, in any language, in any country. Because they do not depend on internet access and answers can be sent whenever it is convenient, they are a suitable channel for situations that require fast, high-volume responses: SMS surveys can deliver 80% of responses in less than two hours,[14] often at much lower cost than face-to-face surveys, since travel and personnel costs are eliminated.[15] IM surveys are similar to SMS surveys, except that a mobile number is not required. IM functions are available in standalone software such as Skype, or embedded in websites such as Facebook and Google.

Online surveys

Online (Internet) surveys have become an essential research tool in a variety of fields, including marketing, social research, and official statistics. According to ESOMAR, online survey research accounted for 20% of global data-collection expenditure in 2006. Online surveys offer capabilities beyond those of any other type of self-administered questionnaire. Online consumer panels are also used extensively for carrying out surveys, but their quality is often considered inferior because panelists are regular contributors and tend to become fatigued. However, studies estimating measurement quality (defined as the product of reliability and validity) with a multitrait-multimethod (MTMM) approach have found quite reasonable quality,[16][17] and one even found that the measurement quality of a series of questions in an online opt-in panel (Netquest) was very similar to that of the same questions asked in the European Social Survey (ESS), a face-to-face survey.[18]
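The definition of measurement quality as the product of reliability and validity can be made concrete with a minimal sketch. The coefficient values below are invented for illustration and are not figures from the cited studies:

```python
# Illustrative sketch: measurement quality in the MTMM sense, defined as
# the product of a question's reliability and validity coefficients.
# All numeric values here are hypothetical, not results from the studies cited.

def measurement_quality(reliability: float, validity: float) -> float:
    """Return the measurement quality of a survey question,
    defined as reliability multiplied by validity."""
    for name, coef in (("reliability", reliability), ("validity", validity)):
        if not 0.0 <= coef <= 1.0:
            raise ValueError(f"{name} must lie in [0, 1], got {coef}")
    return reliability * validity

# Hypothetical comparison of the same question across two modes:
web_quality = measurement_quality(reliability=0.85, validity=0.90)  # ~0.765
f2f_quality = measurement_quality(reliability=0.87, validity=0.89)  # ~0.774
```

Comparing such quality estimates for the same questions across modes is the kind of analysis the MTMM studies above perform at scale.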

Some studies have compared the quality of face-to-face and/or telephone surveys with that of online surveys, both for single questions and for more complex concepts measured with several questions (composite scores or indices).[19][20][21] Focusing only on probability-based surveys (including the online ones), they found that face-to-face surveys (using show-cards) and web surveys achieve quite similar levels of measurement quality, whereas telephone surveys perform worse. Other studies comparing paper-and-pencil questionnaires with web-based questionnaires showed that employees preferred the online approach. There are, however, concerns about "ballot stuffing", in which employees submit repeated responses to the same survey. Some employees are also concerned about privacy: even if they do not provide their names when responding to a company survey, can they be certain their anonymity is protected? Such fears prevent some employees from expressing an opinion.[22]


Key methodological issues of online surveys

These issues, and potential remedies, are discussed in a number of sources.[26][27]

Telephone

Telephone surveys use interviewers to encourage sample persons to respond, which leads to higher response rates.[28] There is some potential for interviewer bias (for example, some people may be more willing to discuss a sensitive issue with a female interviewer than with a male one). Depending on local call-charge structure and coverage, this mode can be cost-efficient and may suit large national (or international) sampling frames, using traditional phones or computer-assisted telephone interviewing (CATI). Because it is audio-only, this mode cannot present non-audio material such as graphics, demonstrations, or taste/smell samples.

Mail

Depending on local bulk-mail postage rates, mail surveys can cost relatively little compared with other modes. The fielding period, however, tends to be long, often several months, before the questionnaires are returned and statistical analysis can begin. The questionnaire may be handed to respondents or mailed to them, but in all cases it is returned to the researcher by mail. Because no interviewer is present, the mail mode is not suitable for questions that may require clarification. On the other hand, there is no interviewer bias, and respondents can answer at their own convenience, breaking up long surveys or checking records when a question requires it. Nonresponse bias can be corrected by extrapolating across response waves.[29] Response rates can be improved by using mail panels (whose members agree in advance to participate) and prepaid monetary incentives,[30] and they are also affected by the class of mail through which the survey is sent.[31] Panels can be used in longitudinal designs in which the same respondents are surveyed repeatedly.
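The extrapolation-across-waves idea rests on the observation that respondents who answer in later mailing waves tend to resemble nonrespondents. A minimal sketch of one simple linear-trend variant follows; the wave data are hypothetical, and this is a simplification rather than Armstrong and Overton's exact procedure:

```python
# Simplified sketch of extrapolation across mailing waves for nonresponse
# bias: later-wave respondents are assumed to resemble nonrespondents, so
# the trend in wave means is projected one wave beyond the last observed one.
# The data below are invented for illustration.

def extrapolate_nonrespondents(wave_means: list[float]) -> float:
    """Project the change between the last two wave means one wave
    further, as a rough estimate for nonrespondents."""
    if len(wave_means) < 2:
        raise ValueError("need at least two waves to estimate a trend")
    step = wave_means[-1] - wave_means[-2]  # last observed between-wave change
    return wave_means[-1] + step

# Mean reported spending by response wave (hypothetical):
waves = [52.0, 48.0, 45.0]                    # waves 1, 2, and 3
estimate = extrapolate_nonrespondents(waves)  # 45.0 + (45.0 - 48.0) = 42.0
```

The nonrespondent estimate can then be combined with the respondent mean, weighted by the nonresponse rate, to adjust the overall survey estimate.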

The visual presentation of survey questions makes a difference in how respondents answer them. Four primary design elements are involved: words (meaning), numbers (sequencing), symbols (e.g., an arrow), and graphics (e.g., text boxes). In translated surveys, writing practice (e.g., Spanish words are longer and require more printing space) and text orientation (e.g., Arabic is read from right to left) must be considered in questionnaire visual design to minimize missing data.[32][33]

Face-to-face

The face-to-face mode is suitable for locations where telephone or mail infrastructure is poorly developed. As in the telephone mode, the interviewer's presence carries a risk of interviewer bias.

Video interviewing

Video interviewing is similar to face-to-face interviewing except that the interviewer and respondent are not physically in the same location; they communicate via video-conferencing software such as Zoom or Microsoft Teams.

Virtual worlds

Virtual-world interviews take place online in a space created for virtual interaction with other users or players, such as Second Life. Respondent and interviewer each choose an avatar to represent themselves and interact through a chat feature or live voice audio.

Chatbots

Chatbots are regularly used in marketing and sales to gather feedback on customer experience. When used to collect survey responses, chatbot surveys should be kept short, trained to speak in a friendly, human tone, and given an easy-to-navigate interface supported by more advanced artificial intelligence.[34]

Mixed-mode surveys

Researchers can combine several of the above methods for data collection. For example, they can recruit shoppers at malls and send willing participants questionnaires by email. With the introduction of computers to the survey process, survey modes now include combinations of different approaches, known as mixed-mode designs.[35]


Notes and References

  1. Vehovar, V., & Lozar Manfreda, K. (2008). "Overview: Online Surveys". In N. Fielding, R. M. Lee, & G. Blank (Eds.), The SAGE Handbook of Online Research Methods (pp. 177–194). London: SAGE. ISBN 978-1-4129-2293-7.
  2. Bethlehem, J., & Biffignandi, S. (2012). Handbook of Web Surveys. Wiley Handbooks in Survey Methodology. New Jersey: John Wiley & Sons. ISBN 978-1-118-12172-6.
  3. Cook, S., & Sha, M. (2016). "Technology options for engaging respondents in self-administered questionnaires and remote interviewing". RTI Press. doi:10.3768/rtipress.2016.op.0026.1603.
  4. Agley, J., Meyerson, B., Eldridge, L., Smith, C., Arora, P., Richardson, C., & Miller, T. (2019). "Just the fax, please: Updating electronic/hybrid methods for surveying pharmacists". Research in Social and Administrative Pharmacy, 15(2), 226–227. doi:10.1016/j.sapharm.2018.10.028.
  5. Mellenbergh, G. J. (2008). "Surveys". In H. J. Adèr & G. J. Mellenbergh (Eds.), Advising on Research Methods: A Consultant's Companion (pp. 183–209). Huizen, The Netherlands: Johannes van Kessel Publishing. ISBN 978-90-79418-01-5.
  6. "Mobile-ready. Event driven. Feature rich. Online customer surveys". Archived 23 October 2015 at https://web.archive.org/web/20151023231506/http://www.questback.com/online-customer-surveys.
  7. Schuster, C., & Perez Brito, C. "Evaluating Cash Transfers in Guatemala". Magpi. Retrieved 27 November 2016.
  8. Revilla, M., Toninelli, D., Ochoa, C., & Loewe, G. (2015). "Who has access to mobile devices in an online opt-in panel? An analysis of potential respondents for mobile surveys". In D. Toninelli, R. Pinter, & P. de Pedraza (Eds.), Mobile Research Methods: Opportunities and Challenges of Mobile Research Methodologies (pp. 119–139). London: Ubiquity Press. doi:10.5334/bar.h.
  9. Callegaro, M. (2013). "Do You Know Which Device Your Respondent Has Used to Take Your Online Survey?". Survey Practice, 3(6).
  10. "Mobile engagement becomes standard operating procedure". Survey Anyplace. Archived 8 February 2014 at https://web.archive.org/web/20140208043428/http://surveyanyplace.com/why-mobile-surveys/.
  11. Burger, C., Riemer, V., Grafeneder, J., Woisetschläger, B., Vidovic, D., & Hergovich, A. (2010). "Reaching the Mobile Respondent: Determinants of High-Level Mobile Phone Use Among a High-Coverage Group". Social Science Computer Review, 28(3), 336–349. doi:10.1177/0894439309353099.
  12. Mavletova, A., & Couper, M. P. (2013). "Sensitive Topics in PC Web and Mobile Web Surveys: Is There a Difference?". Survey Research Methods, 7(3), 191–205. doi:10.18148/srm/2013.v7i3.5458.
  13. Toninelli, D., & Revilla, M. (2016). "Smartphones vs PCs: Does the Device Affect the Web Survey Experience and the Measurement Error for Sensitive Topics? A Replication of the Mavletova & Couper's 2013 Experiment". Survey Research Methods, 10(2), 153–169. doi:10.18148/srm/2016.v10i2.6274.
  14. "SMS surveys". OnePoint Global. Retrieved 27 June 2016.
  15. Selanikio, J. "Getting More Data for Less Money". Magpi. Retrieved 9 November 2016.
  16. Revilla, M., & Ochoa, C. (2015). "Quality of Different Scales in an Online Survey in Mexico and Colombia". Journal of Politics in Latin America, 7(3), 157–177. doi:10.1177/1866802X1500700305.
  17. Revilla, M., & Saris, W. E. (2015). "Estimating and comparing the quality of different scales of an online survey using an MTMM approach". In U. Engel (Ed.), Survey Measurements: Techniques, Data Quality and Sources of Error (pp. 53–74). Frankfurt/New York: Campus. ISBN 978-3-593-50280-9.
  18. Revilla, M., Saris, W., Loewe, G., & Ochoa, C. (2015). "Can a non-probabilistic online panel achieve question quality similar to that of the European Social Survey?". International Journal of Market Research, 57(3), 395–412. doi:10.2501/IJMR-2015-034.
  19. Revilla, M. (2015). "Comparison of the quality estimates in a mixed-mode and a unimode design: an experiment from the European Social Survey". Quality and Quantity, 49(3), 1219–1238. doi:10.1007/s11135-014-0044-5.
  20. Revilla, M. A. (2012). "Measurement invariance and quality of composite scores in a face-to-face and a web survey". Survey Research Methods, 7(1), 17–28. doi:10.18148/srm/2013.v7i1.5098.
  21. Revilla, M. (2010). "Quality in Unimode and Mixed-Mode designs: A Multitrait-Multimethod approach". Survey Research Methods, 4(3), 151–164. doi:10.18148/srm/2010.v4i3.4278.
  22. Schultz & Schultz (2010). Psychology and Work Today (p. 40). New York: Prentice Hall. ISBN 978-0-205-68358-1.
  23. Dillman, D. A. (2006). Mail and Internet Surveys: The Tailored Design Method (2nd ed.). New Jersey: John Wiley & Sons. ISBN 978-0-470-03856-7.
  24. Wright, K. (2005). "Researching Internet-Based Populations: Advantages and Disadvantages of Online Survey Research, Online Questionnaire Authoring Software Packages, and Web Survey Services". Journal of Computer-Mediated Communication, 10(3), 1034. Retrieved 6 March 2018.
  25. Dwivedi, Y. K., Ismagilova, E., Hughes, D. L., Carlson, J., Filieri, R., Jacobson, J., Jain, V., Karjaluoto, H., Kefi, H., Krishen, A. S., Kumar, V., Rahman, M. M., Raman, R., Rauschnabel, P. A., & Rowley, J. (2021). "Setting the future of digital and social media marketing research: Perspectives and research propositions". International Journal of Information Management, 59, 102168. doi:10.1016/j.ijinfomgt.2020.102168.
  26. Salant, P., & Dillman, D. A. (1995). How to Conduct Your Own Survey: Leading Professionals Give You Proven Techniques for Getting Reliable Results.
  27. Kalton, G. (1983). Introduction to Survey Sampling (Vol. 35). Sage.
  28. Groves, R. M. (1989). Survey Costs and Survey Errors. New York: Wiley. ISBN 978-0-471-67851-9.
  29. Armstrong, J. S., & Overton, T. S. (1977). "Estimating Nonresponse Bias in Mail Surveys". Journal of Marketing Research, 14(3), 396–402. doi:10.2307/3150783. Archived 20 June 2010 at https://web.archive.org/web/20100620200022/http://marketing.wharton.upenn.edu/ideas/pdf/Armstrong/EstimatingNonresponseBias.pdf.
  30. Armstrong, J. S. (1975). "Monetary Incentives in Mail Surveys". Public Opinion Quarterly, 39, 111–116. doi:10.1086/268203.
  31. Armstrong, J. S. (1990). "Class of Mail Does Affect Response Rates to Mailed Questionnaires: Evidence from Meta-Analysis (with a Reply by Lee Harvey)". Journal of the Market Research Society, 32, 469–472.
  32. Wang, K., & Sha, M. M. (2013). "A Comparison of Results from a Spanish and English Mail Survey: Effects of Instruction Placement on Item Missingness". Survey Methods: Insights from the Field (SMIF). doi:10.13094/SMIF-2013-00006.
  33. Pan, Y., & Sha, M. (2019). The Sociolinguistics of Survey Translation. London: Routledge. ISBN 978-0-429-29491-4. doi:10.4324/9780429294914.
  34. Dandapani, A. (2020). "Redesigning Conversations with Artificial Intelligence" (Chapter 11). In M. Sha (Ed.), The Essential Role of Language in Survey Research (pp. 221–230). RTI Press. ISBN 978-1-934831-24-3. doi:10.3768/rtipress.bk.0023.2004.
  35. Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2009). Survey Methodology. New Jersey: John Wiley & Sons. ISBN 978-1-118-21134-2.