CRAAP test explained

The CRAAP test is a method for evaluating the reliability of information sources across academic disciplines. CRAAP is an acronym for Currency, Relevance, Authority, Accuracy, and Purpose. Because of the vast number of sources available online, it can be difficult to tell whether a given source is trustworthy enough to use in research. The CRAAP test aims to make it easier for educators and students to determine whether their sources can be trusted; by applying the test while evaluating sources, a researcher can reduce the likelihood of relying on unreliable information. The test, developed by Sarah Blakeslee and her team of librarians at California State University, Chico (CSU Chico),[1] is used mainly by librarians in higher education. It is one of various approaches to source criticism.
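Purely as an illustration, the five criteria behind the acronym can be sketched as a simple checklist. The question wording and the all-criteria-must-pass rule below are hypothetical choices for the sketch, not part of Blakeslee's published test:

```python
# Illustrative sketch only: the five CRAAP criteria as yes/no questions.
# The question wording and the all-pass rule are hypothetical choices,
# not part of Blakeslee's published test.
CRAAP_CRITERIA = {
    "Currency": "Is the information current enough for the topic?",
    "Relevance": "Does the information relate to the research question?",
    "Authority": "Is the author or publisher a credible source?",
    "Accuracy": "Is the content supported by evidence and verifiable?",
    "Purpose": "Is the intent to inform rather than to sell or persuade?",
}

def evaluate_source(answers):
    """Return True only if every criterion was answered 'yes' (True)."""
    missing = set(CRAAP_CRITERIA) - set(answers)
    if missing:
        raise ValueError(f"unanswered criteria: {sorted(missing)}")
    return all(answers[name] for name in CRAAP_CRITERIA)
```

In this sketch, a source that fails even one criterion (for example, a persuasive Purpose) is flagged for further scrutiny rather than accepted.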

History

The test was developed by Sarah Blakeslee[2][3] during the spring of 2004, while she was preparing a workshop for first-year instructors. Blakeslee was frustrated that she could not remember the criteria for evaluating sources, and after much thought she came up with the acronym, wanting to give students an easier way to determine which sources are credible. One earlier instrument is the SAILS test (Standardized Assessment of Information Literacy Skills), created in 2002 by a group of librarians at Kent State University to assess students' information literacy skills. The SAILS test focuses on scores as a quantitative measure of how well students evaluate sources.[4] While the SAILS test is more specific in its evaluation criteria, it shares the same objectives as the CRAAP test.

Website evaluation

One university has begun using the CRAAP test to teach students how to evaluate online content. In a 2017 article, Cara Berg, a reference librarian and co-coordinator of user education at William Paterson University, emphasizes website evaluation as a tool for active research.[5] At Berg's university, for example, library instruction is given to roughly 300 classes across subjects whose coursework requires students to find and evaluate sources. Website evaluation using the CRAAP test was incorporated into the university's first-year seminar to help students hone their research skills.

Challenges in the classroom

When the CRAAP test was first implemented at William Paterson University, there were some practical challenges. The website-evaluation workshop felt rushed, and in most cases librarians could not cover all the material in a single class session. Because the website-evaluation portion was compressed for time, student performance on an assessment of website evaluation was poor. To address these problems, the librarians developed a "flipped" method: students watched a video covering two of the three workshop sections on their own time, so that the entire class period could be devoted to website evaluation. Student performance on assessments of their knowledge of CRAAP for website evaluation improved after this change to instruction.

Pedagogical uses

The CRAAP test is generally taught in library instruction as part of a first-year seminar, participation in which is a graduation requirement at William Paterson University. Beyond English courses, many other courses, such as science and engineering classes, have adopted the method; the test is applied in the same way as in website evaluation and is used across all of these courses. Universities that use the CRAAP test include Central Michigan University,[6] Benedictine University,[7] and the Community College of Baltimore County,[8] among many others. Other schools use the test to help students succeed on assignments in subjects that require research papers.

Alternatives and criticisms

See also: Heuristic-systematic model of information processing.

In 2004, Marc Meola's paper "Chucking the Checklist" critiqued the checklist approach to evaluating information,[9] and librarians and educators have explored alternative approaches.

Mike Caulfield, who has criticized some uses of the CRAAP test in information literacy,[10] has emphasized an alternative approach using step-by-step heuristics that can be summarized by the acronym SIFT: "Stop; Investigate the source; Find better coverage; Trace claims, quotes, and media to the original context".[11][12]

In a December 2019 article, Jennifer A. Fielding raised the issue that the CRAAP method's focus is on a "deep-dive" into the website being evaluated, but noted that "in recent years the dissemination of mis- and disinformation online has become increasingly sophisticated and prolific, so restricting analysis to a single website's content without understanding how the site relates to a wider scope now has the potential to facilitate the acceptance of misinformation as fact."[13] Fielding contrasted use of the CRAAP method, a "vertical reading" of a single website, with "lateral reading", a fact-checking method to find and compare multiple sources of information on the same topic or event.[13]

In a 2020 working paper, Sam Wineburg, Joel Breakstone, Nadav Ziv and Mark Smith found that using the CRAAP method for information literacy education makes "students susceptible to misinformation". According to these authors, the method needs thorough adaptation in order to help students detect fake news and biased or satirical sources in the digital age.[14]

References

  1. Korber, Irene. "LibGuides: Literature Reviews: Evaluating Info". libguides.csuchico.edu. Retrieved 2018-05-21. Archived from the original on 2018-05-08: https://web.archive.org/web/20180508150205/http://libguides.csuchico.edu/c.php?g=414315&p=2822716.
  2. "Library Staff Directory, Meriam Library". library.csuchico.edu. Retrieved 2018-05-27. Archived from the original on 2018-09-05: https://web.archive.org/web/20180905000147/https://library.csuchico.edu/directory-cards.
  3. Blakeslee, Sarah (2004). "The CRAAP Test". LOEX Quarterly. 31 (3). Retrieved 2018-05-28. Archived from the original on 2018-06-12: https://web.archive.org/web/20180612143806/http://commons.emich.edu/loexquarterly/vol31/iss3/4/.
  4. "Project SAILS: Standardized Assessment of Information Literacy Skills". Project SAILS. May 29, 2018. Retrieved June 3, 2018. Archived from the original on May 28, 2018: https://web.archive.org/web/20180528172828/https://www.projectsails.org/.
  5. Berg, Cara (March–April 2017). "Teaching Website Evaluation: The CRAAP Test and the Evolution of an Approach". Internet@schools. 24 (2): 8–11. Retrieved 25 July 2019. Archived (dead link) on 10 August 2019: https://web.archive.org/web/20190810124434/http://www.internetatschools.com/Articles/Editorial/Features/Teaching-Website-Evaluation-The-CRAAP-Test-and-the-Evolution-of-an-Approach-116769.aspx.
  6. Renirie, Rebecca. "Research Guides: Website Research: CRAAP Test". libguides.cmich.edu. Retrieved 2018-06-12. Archived from the original on 2017-12-27: https://web.archive.org/web/20171227080230/http://libguides.cmich.edu/web_research/craap.
  7. Hopkins, Joan. "Research Guides: Evaluating Sources: The CRAAP Test". researchguides.ben.edu. Retrieved 2018-06-12. Archived from the original on 2018-06-12: https://web.archive.org/web/20180612140211/http://researchguides.ben.edu/source-evaluation.
  8. Casey, Sharon. "Research Guides: Evaluate It!: C.R.A.A.P. Criteria". libraryguides.ccbcmd.edu. Retrieved 2018-06-12. Archived (dead link) on 2018-06-12: https://web.archive.org/web/20180612140445/http://libraryguides.ccbcmd.edu/evaluate-it/craap.
  9. Lenker, Mark (October 2017). "Developmentalism: Learning as the Basis for Evaluating Information". 17 (4): 721–737. doi:10.1353/pla.2017.0043. Retrieved 2019-12-19. Archived from the original on 2019-01-01: https://web.archive.org/web/20190101001731/https://muse.jhu.edu/article/672181.
  10. Caulfield, Mike (September 14, 2018). "A Short History of CRAAP". hapgood.us. Retrieved 2019-06-14. Archived from the original on 2019-04-01: https://web.archive.org/web/20190401040402/https://hapgood.us/2018/09/14/a-short-history-of-craap/.
  11. Fister, Barbara; MacMillan, Margy (May 31, 2019). "Mike Caulfield: Truth Is in the Network: Smart Talk Interview, no. 31". projectinfolit.org. Retrieved 2019-06-14. Archived from the original on 2019-08-06: https://web.archive.org/web/20190806035750/https://www.projectinfolit.org/mike-caulfield-smart-talk.html.
  12. See also: Stellino, Molly (December 12, 2018). "Shortcut roundup: quick guides to become media literate". newscollab.org. Retrieved 2019-06-19. Archived from the original on 2019-04-06: https://web.archive.org/web/20190406183138/https://newscollab.org/2018/12/12/shortcut-roundup-quick-guides-to-become-media-literate/. Stellino lists Caulfield's four moves (an earlier version of SIFT) alongside other acronyms and heuristics and then summarizes the common factors that she sees in all of them.
  13. Fielding, Jennifer A. (December 2019). "Rethinking CRAAP: Getting students thinking like fact-checkers in evaluating web sources". C&RL News. 80 (11): 620–622. doi:10.5860/crln.80.11.620. Retrieved 2019-12-31. Archived from the original on 2019-12-31: https://web.archive.org/web/20191231235436/https://crln.acrl.org/index.php/crlnews/article/view/24195.
  14. Wineburg, Sam; Breakstone, Joel; Ziv, Nadav; Smith, Mark (2020). "Educating for Misunderstanding: How Approaches to Teaching Digital Literacy Make Students Susceptible to Scammers, Rogues, Bad Actors, and Hate Mongers". Stanford History Working Group Working Paper A-21322. Retrieved 2021-08-11. Archived from the original on 2021-08-11: https://web.archive.org/web/20210811104951/https://purl.stanford.edu/mf412bt5333.