Cognitive pretesting

Cognitive pretesting, or cognitive interviewing, is a field research method in which data are collected on how the subject answers interview questions. It is the evaluation of a test or questionnaire before it is administered.[1] It allows survey researchers to collect feedback on survey responses and to evaluate whether a question measures the construct the researcher intends. The data collected are then used to adjust problematic questions in the questionnaire before fielding the survey to the full sample.[2][3][4]

Cognitive interviewing generally collects the following information from participants: how the subject constructed their answers; what the subject interprets the questions to mean; any difficulties the subject had in answering the questions; and anything else that reveals the circumstances behind the subject's answers.

Cognitive pretesting is considered essential in testing the validity of an interview, test, or questionnaire.[5]

Purpose

The purpose of these pretests is to evaluate whether questions are understood as intended and measure the intended constructs, and to identify and repair problematic questions before the survey is fielded.

Types

In general, many methods are practiced when conducting a cognitive pretest, including conventional pretesting, cognitive interviewing, behavior coding, respondent debriefing, group discussion, expert review, eye tracking, and web probing.

Conventional pretesting- This is similar to a rehearsal: a simulation of the real test or interview that takes place prior to the real one, imitating it as closely as possible. Whatever method is used in the actual interview or test should also be used in this form of pretesting.

Cognitive pretesting (cognitive interviewing)- This is very similar to conventional pretesting, except that participants are actively asked about the questions as they take the test. It is conducted during the interview or test itself.

Pretests can also be administered in multiple ways, including written, oral, and electronic surveys.[3]

Techniques

There are certain techniques that the interviewer implements in cognitive pretesting to extract the information needed to ensure a good interview or questionnaire.

The think-aloud technique- This occurs when the interviewer asks the interviewee to vocalize their thoughts and how they arrived at their answer. It can be done concurrently (during the interview) or retrospectively (after it).[6]

Probing technique- This occurs when the interviewer asks the interviewee one or more follow-up questions. They 'probe' about the questions asked, the terminology used, or the responses given. Probes can be concurrent (during the task, without disrupting it) or retrospective (after the task).

Paraphrasing- This occurs when the interviewer asks the interviewee to repeat the question in their own words. This tests whether the questions are understandable.

Confidence rating- This occurs when the interviewer asks the interviewee about their confidence in how correctly they answered the question.

Sorting or card sorting- This occurs when the interviewer asks the interviewee to sort items in order to understand how the interviewee categorizes certain situations or terms.
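Card-sort results are often summarized by counting how frequently respondents place the same pair of items in one pile. The sketch below, using hypothetical terms and pile assignments (not from any cited study), shows one minimal way to compute such a pairwise co-occurrence count:

```python
from itertools import combinations
from collections import Counter

# Hypothetical card-sort data: each participant groups the same set of
# terms into piles of their own choosing.
sorts = [
    [{"doctor visit", "checkup"}, {"flu", "cold"}],    # participant 1
    [{"doctor visit", "checkup", "flu"}, {"cold"}],    # participant 2
    [{"doctor visit"}, {"checkup"}, {"flu", "cold"}],  # participant 3
]

def co_occurrence(sorts):
    """Count how often each pair of cards lands in the same pile."""
    counts = Counter()
    for piles in sorts:
        for pile in piles:
            for pair in combinations(sorted(pile), 2):
                counts[pair] += 1
    return counts

counts = co_occurrence(sorts)
for pair, n in counts.most_common():
    print(pair, n)
```

Pairs sorted together most often are interpreted as closest in meaning for the respondents tested; large counts matrices of this kind also feed into cluster analyses.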

Vignettes- These are short descriptions of one or more hypothetical characters (similar to vignettes used in psychological and sociological experiments, or anchoring vignettes in quantitative survey research) and are used to investigate the respondent's cognitive processing with regard to their survey-relevant decisions.

Web probing- This technique implements cognitive interview probing techniques in web surveys. Its strengths include standardization, anonymity, and large and fast coverage because it is administered via the web. However, web probing can only reach online population groups, probe nonresponse occurs, and probe answers that are insufficient in content cannot be followed up.[7]
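Because web probes cannot be followed up in the moment, open-ended probe answers are typically triaged after collection into usable text, nonresponse, and answers too thin to code. The sketch below is a minimal illustration of such a screening rule; the respondent IDs, answers, and word-count threshold are all assumptions, and real studies define their own coding rules:

```python
# Hypothetical open-ended probe answers collected in a web survey.
probe_answers = {
    "r1": "I thought 'household' meant everyone who sleeps here most nights.",
    "r2": "",           # probe nonresponse
    "r3": "dont know",  # insufficient content
    "r4": "ok",         # insufficient content
}

MIN_WORDS = 3  # assumed threshold for a codable answer

def triage(answers, min_words=MIN_WORDS):
    """Split probe answers into usable text, nonresponse, and answers
    too thin to code for content."""
    usable, nonresponse, insufficient = {}, [], []
    for rid, text in answers.items():
        words = text.split()
        if not words:
            nonresponse.append(rid)
        elif len(words) < min_words or text.strip().lower() in {"dont know", "don't know"}:
            insufficient.append(rid)
        else:
            usable[rid] = text
    return usable, nonresponse, insufficient
```

Nonresponse and insufficiency rates computed this way are themselves reported as quality indicators of the probe design.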

Participants and recruitment

Sample size is an important consideration in pretests. Small samples of 5–15 participants are common. While some researchers suggest a sample size of at least 30 people, with more always better,[8] current best practice is to design the research in rounds so that changes can be retested. For example, when pretesting a questionnaire, it is more useful to conduct three rounds of 9 participants than one round of 27.[9]
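The detection side of this trade-off can be quantified under a simple assumption: if each participant independently reveals a given problem with probability p, then at least one of n participants reveals it with probability 1 − (1 − p)^n. This formula does not capture the main advantage of rounds, which is retesting revised wording, but it shows what a single round of a given size can detect. A small illustrative sketch, with an assumed problem prevalence of 15%:

```python
def p_detect(prevalence: float, n: int) -> float:
    """Probability that at least one of n participants reveals a problem,
    assuming each participant independently reveals it with probability
    `prevalence`."""
    return 1 - (1 - prevalence) ** n

# A problem affecting 15% of respondents:
print(round(p_detect(0.15, 9), 3))   # a single round of 9
print(round(p_detect(0.15, 27), 3))  # a single round of 27
```

A round of 9 already detects such a problem roughly three times out of four; the extra certainty of 27 participants matters less than the chance to verify that the revised wording actually works.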

There are two different methods of telling participants about the questionnaire: participating pretests, in which respondents are told the questionnaire is being tested, and undeclared pretests, in which they are not.

Cross-cultural research

When conducting cognitive interviews in non-English languages, recent research recommends not restricting sample selection and recruitment to non-English-speaking monolinguals, which was a common practice among survey researchers.[10] When recruiting hard-to-reach respondents with particular characteristics via purposive sampling, community-based recruitment (word of mouth, endorsement from community leaders) works better than advertisements.[11][12][13]

Use by survey researchers

Cognitive interviewing is regularly practiced by U.S. Federal Agencies, including the Census Bureau,[14][15] National Center for Health Statistics (NCHS),[16] and the Bureau of Labor Statistics.[17] The NCHS maintains a database of U.S. and international agencies that have conducted cognitive interview projects and contributed reports to their depository, such as the National Science Foundation and GESIS – Leibniz Institute for the Social Sciences.[18]

Cross-cultural cognitive interviewing is practiced to evaluate survey question equivalence and sources of difficulties, as well as to repair problems related to translation.[19][20] Because of differences in communication styles and cultural norms, adaptations are needed in protocol setup[21] and design, use of vignettes,[22] and verbal probing.[23]

Standards

In October 2016, the U.S. Office of Management and Budget (OMB) issued Statistical Policy Directive No. 2 Addendum: Standards and Guidelines for Cognitive Interviews (https://www.whitehouse.gov/wp-content/uploads/2021/04/final_addendum_to_stat_policy_dir_2.pdf), which included seven standards for cognitive interviews conducted by or for U.S. Federal studies. Another standard proposed by researchers is the Cognitive Interviewing Reporting Framework (CIRF), which applies a 10-category checklist to make clear what was done during the cognitive interviews and how conclusions were drawn from their procedures and results.[24] In addition, a project management approach is recommended when managing cognitive interviewing studies.[25] For translated surveys, cognitive interviewing techniques, participant selection and recruitment, and the project management approach must be adapted to increase their fit for use.[26]

Notes and References

  1. Lenzner, Timo; Neuert, Cornelia; Otto, Wanda (2016). Kognitives Pretesting [Cognitive Pretesting]. GESIS Survey Guidelines. doi:10.15465/gesis-sg_en_010.
  2. "Pretesting – Cross-Cultural Survey Guidelines". ccsg.isr.umich.edu. Retrieved 2020-07-18.
  3. "Writing@CSU". writing.colostate.edu. Retrieved 2020-07-18.
  4. Babonea, Alina-Mihaela; Voicu, Mirela-Cristina (April 2011). "Questionnaires pretesting in marketing research". Challenges of the Knowledge Society. 1: 1323–1330. Romania: Nicolae Titulescu University Publishing House. ISSN 2068-7796.
  5. "GESIS – Leibniz Institute for the Social Sciences". www.gesis.org. Retrieved 2020-07-18.
  6. Tilley, Barbara C.; LaPelle, Nancy R.; Goetz, Christopher G.; Stebbins, Glenn T. (2014). "Using Cognitive Pretesting in Scale Development for Parkinson's Disease: The Movement Disorder Society Unified Parkinson's Disease Rating Scale (MDS-UPDRS) Example". Journal of Parkinson's Disease. 4 (3): 395–404. doi:10.3233/JPD-130310. PMC 5086096. PMID 24613868. ISSN 1877-7171.
  7. "Web Probing". GESIS – Leibniz Institute for the Social Sciences. Retrieved 2023-10-24.
  8. Perneger, Thomas V.; Courvoisier, Delphine S.; Hudelson, Patricia M.; Gayet-Ageron, Angèle (2015). "Sample size for pre-tests of questionnaires". Quality of Life Research. 24 (1): 147–151. doi:10.1007/s11136-014-0752-2. PMID 25008261. S2CID 22314144. ISSN 1573-2649.
  9. Willis, Gordon (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design. Sage. p. 146. ISBN 9780761928041.
  10. Park, Hyunjoo; Sha, M. Mandy; Willis, Gordon (November 2016). "Influence of English-language Proficiency on the Cognitive Processing of Survey Questions". Field Methods. 28 (4): 415–430. doi:10.1177/1525822X16630262. ISSN 1525-822X.
  11. Sha, M. Mandy; Park, Hyunjoo; Liu, Lu (2013). "Exploring the efficiency and utility of methods to recruit non-English speaking qualitative research participants". Survey Practice. 6 (3). doi:10.29115/SP-2013-0015.
  12. Park, Hyunjoo; Sha, M. Mandy (2014). "Evaluating the Efficiency of Methods to Recruit Asian Research Participants". Journal of Official Statistics. 30 (2): 335–354. doi:10.2478/jos-2014-0020.
  13. Sha, Mandy; Moncada, Jennifer (2017). "Successful Techniques to Recruit Hispanic and Latino Research Participants". Survey Practice. 10 (3). doi:10.29115/SP-2017-0014.
  14. Virgile, M.; Katz, J.; Tuttle, D.; Terry, R.; Graber, J. (2019). "Cognitive Pretesting of 2019 American Housing Survey Modules". United States Census Bureau. Archived from the original on 2020-08-08. Retrieved 2020-05-20.
  15. Childs, Jennifer; Sha, Mandy; Peytcheva, Emilia. "Cognitive Testing of the Targeted Coverage Follow-up (TCFU) Interview". Census Working Papers. Retrieved October 4, 2023.
  16. "Q-Bank: Question Evaluation for Surveys". wwwn.cdc.gov. Retrieved 2023-11-05.
  17. Schwartz, Lisa K. "The American Time Use Survey: cognitive pretesting". Monthly Labor Review. U.S. Bureau of Labor Statistics. www.bls.gov. Retrieved 2020-05-27.
  18. "Explore Reports by Agency – Q-Bank". wwwn.cdc.gov. Retrieved 2023-11-05.
  19. Willis, Gordon (2 May 2015). "The Practice of Cross-Cultural Cognitive Interviewing". 79 (S1): 359–395.
  20. Aizpurua, Eva (2020). "Pretesting methods in cross-cultural research (Chapter 7)". In Sha, Mandy; Gabel, Tim (eds.). The Essential Role of Language in Survey Research. RTI Press. pp. 129–150. doi:10.3768/rtipress.bk.0023.2004. ISBN 978-1-934831-23-6.
  21. Park, Hyunjoo; Goerman, Patricia; Sha, Mandy (2017). "Exploring the Effects of Pre-interview Practice in Asian Language Cognitive Interviews". Survey Practice. 10 (3). doi:10.29115/SP-2017-0019.
  22. Sha, Mandy (2016). "The Use of Vignettes in Evaluating Asian Language Questionnaire Items". Survey Practice. 9 (3). doi:10.29115/SP-2016-0013.
  23. Mneimneh, Zeina Nazih (2018). "Probing for sensitivity in translated survey questions: Differences in respondent feedback across cognitive probe types". Translation & Interpreting. 10 (special issue on translation of questionnaires in cross-national and cross-cultural research, edited by Mandy Sha and Dorothée Behr): 73–88. ISSN 1836-9324.
  24. Boeije, Hennie; Willis, Gordon (August 2013). "The Cognitive Interviewing Reporting Framework (CIRF)". Methodology. 9 (3): 87–95. doi:10.1027/1614-2241/a000075. ISSN 1614-1881.
  25. Sha, Mandy; Childs, Jennifer Hunter (2014). "Applying a project management approach to survey research projects that use qualitative methods". Survey Practice. 7 (4). doi:10.29115/SP-2014-0021.
  26. Sha, Mandy; Pan, Yuling (2013). "Adapting and Improving Methods to Manage Cognitive Pretesting of Multilingual Survey Instruments". Survey Practice. 6 (4). doi:10.29115/SP-2013-0024.