Journal ranking explained

Journal ranking is widely used in academic circles to evaluate an academic journal's impact and quality. Journal rankings are intended to reflect a journal's place within its field, the relative difficulty of publishing in it, and the prestige associated with it. They have been introduced as official research evaluation tools in several countries.

Measures

Traditionally, journal ranking "measures" or evaluations were provided simply through institutional lists established by academic leaders or through a committee vote. These approaches are notoriously politicized and inaccurate reflections of actual prestige and quality: they often reflect the biases and personal career objectives of those ranking the journals, and they produce highly disparate evaluations across institutions.[1] Many institutions have therefore required external sources of evaluation of journal quality. The traditional external approach has been surveys of leading academics in a given field, but this too is open to bias, though less profoundly than institution-generated lists.[2] Consequently, governments, institutions, and leaders in scientometric research have turned to a range of journal-level bibliometric measures that can serve as surrogates for quality and thus reduce the need for subjective assessment.[1]

Several journal-level metrics have been proposed, most of them citation-based, including the impact factor, the h-index, the Eigenfactor, the SCImago Journal Rank (SJR), and the Source Normalized Impact per Paper (SNIP).
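The best-known of these metrics, the two-year impact factor, is a simple ratio: citations received in a given year to items a journal published in the preceding two years, divided by the number of citable items it published in those two years. A minimal sketch, with invented figures for illustration:

```python
# Hypothetical example: computing a journal's two-year impact factor.
# All figures are invented for illustration.

def impact_factor(citations_to_prev_two_years: int,
                  citable_items_prev_two_years: int) -> float:
    """Two-year impact factor: citations received this year to items
    published in the previous two years, divided by the number of
    citable items published in those two years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# A journal that published 200 citable items in 2022-2023, whose
# 2022-2023 items were cited 500 times during 2024:
print(impact_factor(500, 200))  # 2.5
```

Note that what counts as a "citable item" is itself a judgment made by the database compiler, which is one source of the criticism discussed below.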

Discussion

The negative consequences of rankings are well documented and relate to the performativity of using journal rankings for performance measurement.[20][21] Studies of methodological quality and reliability have found that the "reliability of published research works in several fields may be decreasing with increasing journal rank",[22] contrary to widespread expectations.[23]

For example, McKinnon (2017) analyzed how the ABS-AJG ranking, which despite its methodological shortcomings is widely accepted in British business schools, has had negative consequences for the transportation and logistics management disciplines.[24] A study published in 2021 compared the Impact Factor, the Eigenfactor Score, the SCImago Journal & Country Rank, and the Source Normalized Impact per Paper for journals related to pharmacy, toxicology, and biochemistry, and found "a moderate to high and significant correlation" between them.[25]
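Correlations of this kind are typically computed on ranks rather than raw metric values, since the metrics sit on different scales. A minimal sketch of Spearman's rank correlation between two journal metrics, using only the standard library and invented values (the source does not publish its underlying data):

```python
# Spearman rank correlation between two journal metrics.
# All metric values below are invented for illustration.

def ranks(values):
    """Assign ranks (1 = smallest); ties receive averaged ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                      # extend over a tie group
        avg = (i + j) / 2 + 1           # average rank for the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation of the rank vectors of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

impact_factor = [2.5, 4.1, 1.2, 6.3, 3.0]   # invented values
sjr           = [0.9, 1.6, 0.4, 2.8, 1.1]   # invented values
print(spearman(impact_factor, sjr))  # 1.0: these values rank journals identically
```

A coefficient near 1 means the two metrics order journals almost identically; the study cited above reports moderate to high (not perfect) correlations.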

Thousands of universities and research bodies have issued official statements denouncing the idea that research quality can be measured on the uni-dimensional scale of a journal ranking, most notably by signing the San Francisco Declaration on Research Assessment (DORA), which asks signatories "not [to] use journal-based metrics ... as a surrogate measure of the quality of individual research articles, to assess an individual scientist's contributions, or in hiring, promotion, or funding decisions".[26] The Community for Responsible Research in Business and Management (cRRBM) asks whether "even the academy is being served when faculty members are valued for the quantity and placement of their articles, not for the benefit their research can have for the world".[27] Some academic disciplines, such as management, exhibit a journal ranking list paradox: on the one hand, researchers are aware of the numerous limitations of ranking lists and their deleterious impact on scientific progress; on the other hand, they generally find ranking lists useful and employ them, particularly when their use is not mandated by their institutions.[28]

National rankings

Several national and international rankings of journals exist, e.g. the Australian Research Council's journal ranking,[29] the CORE rankings,[31] Denmark's ministry list,[32] Finland's Julkaisufoorumi,[33] the Norwegian Register,[34] Italy's ANVUR journal rating,[35] the Chartered Association of Business Schools' Academic Journal Guide,[36] Pakistan's list of HEC Recognized Journals,[37] India's NAAS score,[38] and the Polish Ministry of Higher Education and Science list.[39][40]

They have been introduced as official research evaluation tools in several countries.[41]

Notes and References

  1. Lowry, Paul Benjamin; Gaskin, James; Humpherys, Sean L.; Moody, Gregory D.; Galletta, Dennis F.; Barlow, Jordan B.; Wilson, David W. (2013). "Evaluating Journal Quality and the Association for Information Systems Senior Scholars' Journal Basket Via Bibliometric Measures: Do Expert Journal Assessments Add Value?". 37(4): 993–1012. doi:10.25300/MISQ/2013/37.4.01.
  2. Lowry, Paul Benjamin; Romans, Denton; Curtis, Aaron (2004). "Global Journal Prestige and Supporting Disciplines: A Scientometric Study of Information Systems Journals". 5(2): 29–77. doi:10.17705/1jais.00045.
  3. Minasny, Budiman; Hartemink, Alfred E.; McBratney, Alex; Jang, Ho-Jun (2013). "Citations and the h index of soil researchers and journals in the Web of Science, Scopus, and Google Scholar". 1: e183. doi:10.7717/peerj.183.
  4. Serenko, Alexander; Dohan, Michael (2011). "Comparing the expert survey and citation impact journal ranking methods: Example from the field of Artificial Intelligence". 5(4): 629–648. doi:10.1016/j.joi.2011.06.002.
  5. "About OOIR: Journal-level data". Retrieved 2023-03-14.
  6. Holsapple, Clyde W. (2008). "A publication power approach for identifying premier information systems journals". 59(2): 166–185. doi:10.1002/asi.20679.
  7. Serenko, Alexander; Jiao, Changquan (2012). "Investigating Information Systems Research in Canada". 29: 3–24. doi:10.1002/CJAS.214.
  8. Alhoori, Hamed; Furuta, Richard (2013). "Can Social Reference Management Systems Predict a Ranking of Scholarly Venues?". Research and Advanced Technology for Digital Libraries. Lecture Notes in Computer Science 8092: 138–143. doi:10.1007/978-3-642-40501-3_14. ISBN 978-3-64240-500-6.
  9. Cornillier, Fabien; Charles, Vincent (2015). "Measuring the attractiveness of academic journals: A direct influence aggregation model". Operations Research Letters 43(2): 172–176. doi:10.1016/j.orl.2015.01.007.
  10. "Elsevier Announces Enhanced Journal Metrics SNIP and SJR Now Available in Scopus". Elsevier. Retrieved 2014-07-27.
  11. Moed, Henk (2010). "Measuring contextual citation impact of scientific journals". Journal of Informetrics 4(3): 256–277. doi:10.1016/j.joi.2010.01.002.
  12. "Journal Tier List". www.journaltierlist.com. Retrieved 2024-07-31.
  13. Pinski, Gabriel; Narin, Francis (1976). "Citation influence for journal aggregates of scientific publications: Theory with application to literature of physics". 12(5): 297–312. doi:10.1016/0306-4573(76)90048-0.
  14. Liebowitz, S. J.; Palmer, J. P. (1984). "Assessing the relative impacts of economics journals". 22(1): 77–88. JSTOR 2725228.
  15. Palacios-Huerta, Ignacio; Volij, Oscar (2004). "The Measurement of Intellectual Influence". 72(3): 963–977. doi:10.1111/j.1468-0262.2004.00519.x.
  16. Kodrzycki, Yolanda K.; Yu, Pingkang (2006). "New Approaches to Ranking Economics Journals". Contributions to Economic Analysis & Policy 5(1). doi:10.2202/1538-0645.1520.
  17. Bollen, Johan; Rodriguez, Marko A.; Van de Sompel, Herbert (2006). "MESUR: Usage-based metrics of scholarly impact". Proceedings of the 7th ACM/IEEE-CS Joint Conference on Digital Libraries: 669–687. doi:10.1145/1255175.1255273. ISBN 978-1-59593-644-8.
  18. Bergstrom, C. T. (May 2007). "Eigenfactor: Measuring the value and prestige of scholarly journals". 68(5): 314–316. doi:10.5860/crln.68.5.7804.
  19. West, Jevin Darwin. "Eigenfactor.org". Retrieved 2014-05-18.
  20. Espeland, Wendy Nelson; Sauder, Michael (2007). "Rankings and Reactivity: How Public Measures Recreate Social Worlds". American Journal of Sociology 113: 1–40. doi:10.1086/517897.
  21. Grant, David B.; Kovács, Gyöngyi; Spens, Karen (2018). "Questionable research practices in academia: Antecedents and consequences". European Business Review 30(2): 101–127. doi:10.1108/EBR-12-2016-0155.
  22. Brembs, Björn (2018). "Prestigious Science Journals Struggle to Reach Even Average Reliability". Frontiers in Human Neuroscience 12: 37. doi:10.3389/fnhum.2018.00037.
  23. Triggle, Chris R.; MacDonald, Ross; Triggle, David J.; Grierson, Donald (2022). "Requiem for impact factors and high publication charges". Accountability in Research 29(3): 133–164. doi:10.1080/08989621.2021.1909481. Quote: "There is a growing body of evidence against our subjective notion of more prestigious journals publishing 'better' science. In fact, the most prestigious journals may be publishing the least reliable science."
  24. McKinnon, Alan C. (2017). "Starry-eyed II: The logistics journal ranking debate revisited". International Journal of Physical Distribution & Logistics Management 47(6): 431–446. doi:10.1108/IJPDLM-02-2017-0097.
  25. Aquino-Canchari, Christian Renzo; Ospina-Meza, Richard Fredi; Guillen-Macedo, Karla (2020). "Las 100 revistas de mayor impacto sobre farmacología, toxicología y farmacia" [The 100 highest-impact journals in pharmacology, toxicology, and pharmacy]. Revista Cubana de Investigaciones Biomédicas 39(3). ISSN 1561-3011.
  26. "Home". DORA.
  27. Glick, William; Tsui, Anne; Davis, Gerald; Cutler, Dave (2018-05-02). "The Moral Dilemma to Business Research". BizEd Magazine. Archived from the original on 2018-05-07.
  28. Serenko, Alexander; Bontis, Nick (2024). "Dancing with the Devil: The use and perceptions of academic journal ranking lists in the management field". (4): 773–792 (in press). doi:10.1108/JD-10-2023-0217.
  29. "Australian Research Council ranking of journals worldwide". Archived from the original on 2011-06-12.
  30. Li, Xiancheng; Rong, Wenge; Shi, Haoran; Tang, Jie; Xiong, Zhang (2018). "The impact of conference ranking systems in computer science: a comparative regression analysis". Scientometrics 116(2): 879–907. doi:10.1007/s11192-018-2763-1.
  31. "CORE Rankings Portal". core.edu.au. Retrieved 2022-12-27.
  32. "Uddannelses- og Forskningsministeriet" [Danish Ministry of Higher Education and Science].
  33. "Julkaisufoorumi" [Publication Forum]. December 2023.
  34. "Search in Norwegian List | Norwegian Register".
  35. "Rating of Scientific Journals – ANVUR – Agenzia Nazionale di Valutazione del Sistema Universitario e della Ricerca".
  36. "Chartered Association of Business Schools – Academic Journal Guide".
  37. "List of HEC Recognized Journals".
  38. "NAAS Score of Science Journals" (2022-01-01). Archived 2023-03-15.
  39. "Polish Ministry of Higher Education and Science (2019)". www.bip.nauka.gov.pl. 2019-10-12.
  40. "Polish Ministry of Higher Education and Science (2021)". www.bip.nauka.gov.pl. 2021-02-09.
  41. Pontille, David; Torny, Didier (2010). "The controversial policies of journal ratings: Evaluating social sciences and humanities". Research Evaluation 19(5): 347–360. doi:10.3152/095820210X12809191250889.