Scientific citation is the practice of providing a detailed reference, in a scientific publication (typically a paper or book), to previously published (or occasionally private) communications that have a bearing on the subject of the new publication. The purpose of citations in original work is to allow readers to consult the cited work to help them judge the new work, to source background information vital for future development, and to acknowledge the contributions of earlier workers. Citations in, say, a review paper bring together many sources, often recent, in one place.
To a considerable extent, the quality of work is judged, in the absence of other criteria, by the number of citations received, adjusted for the volume of work in the relevant topic. While this is not necessarily a reliable measure, counting citations is trivially easy, whereas judging the merit of complex work can be very difficult.
Previous work may be cited regarding experimental procedures, apparatus, goals, theses, previous theoretical results upon which the new work builds, and so on. Typically such citations establish the general framework of influences and the mindset of the research, indicate which part of science the work belongs to, and can help determine who conducts the peer review.
In patent law, the citation of previous works, or prior art, helps establish the uniqueness of the invention being described. Because the focus of this practice is to claim originality for commercial purposes, the author is motivated to avoid citing works that cast doubt on that originality; in this sense it is not "scientific" citation. However, inventors and their lawyers have a legal obligation to cite all relevant prior art; failing to do so risks invalidating the patent. The patent examiner is then obliged to list any further prior art found in searches.
Citation analysis is a method widely used in metascience:
Modern scientists are sometimes judged by the number of times their work is cited by others, and this is indeed a key indicator of the relative importance of a work in science. Accordingly, individual scientists are motivated to have their own work cited early, often, and as widely as possible, while scientists collectively are motivated to eliminate unnecessary citations so as not to devalue this means of judgment.[1] A formal citation index tracks which refereed and reviewed papers have cited which other such papers. Baruch Lev and other advocates of accounting reform consider the number of times a patent is cited to be a significant metric of its quality, and thus of innovation. Reviews often replace citations to primary studies.[2]
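A citation index of the kind described above can be thought of as an inverted mapping from each paper to the papers that cite it; raw citation counts then fall out directly. A minimal sketch, with all paper identifiers hypothetical:

```python
from collections import defaultdict

# Hypothetical reference lists: paper -> papers it cites.
references = {
    "paper_A": ["paper_B", "paper_C"],
    "paper_B": ["paper_C"],
    "paper_C": [],
    "paper_D": ["paper_C", "paper_B"],
}

# Invert to a citation index: paper -> papers that cite it.
citation_index = defaultdict(list)
for paper, cited in references.items():
    for target in cited:
        citation_index[target].append(paper)

# Citation counts, the raw ingredient of citation-based judgments.
citation_counts = {p: len(citers) for p, citers in citation_index.items()}
print(citation_counts)  # paper_C is the most-cited paper in this toy graph
```

Real citation indexes operate at vastly larger scale, but the core data structure is the same inverted mapping.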
Citation-frequency is one indicator used in scientometrics.
Some studies explore citations and citation frequencies. Researchers have found that papers in leading journals with findings that cannot be replicated tend to be cited more than reproducible research. Results that are not reproducible, or that are not published in a sufficiently transparent and replicable way, are more likely to be wrong and may slow progress; according to one author, what is needed is "a simple way to check how often studies have been repeated, and whether or not the original findings are confirmed". The authors also put forward possible explanations for this state of affairs.[3][4]
Two metascientists reported that in growing scientific fields, citations disproportionately go to already well-cited papers, which may slow or inhibit canonical progress in some cases. They suggest that "structures fostering disruptive scholarship and focusing attention on novel ideas" could be important.[5][6][7]
Other metascientists introduced the 'CD index', intended to characterize "how papers and patents change networks of citations in science and technology", and reported that it has declined over time, which they interpreted as "slowing rates of disruption". They linked this decline to contemporary discovery and invention being informed by "a narrower scope of existing knowledge". The overall number of papers has risen while the number of "highly disruptive" papers has not; notably, the 1998 discovery of the accelerating expansion of the universe has a CD index of 0. Their results also suggest that scientists and inventors "may be struggling to keep up with the pace of knowledge expansion".[8][6][9]
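One common formulation of the CD (disruption) index compares later papers that cite a focal work without citing its references (evidence the focal work displaced its predecessors) against those that cite both (evidence it consolidated them). A minimal sketch under that assumption, with hypothetical paper IDs:

```python
def cd_index(focal_citers, reference_citers):
    """One common formulation of the CD (disruption) index.

    focal_citers: set of later papers citing the focal paper.
    reference_citers: set of later papers citing at least one of
    the focal paper's own references.
    Returns a value in [-1, 1]: positive = disruptive,
    negative = consolidating.
    """
    n_f = len(focal_citers - reference_citers)  # cite focal only (disruptive)
    n_b = len(focal_citers & reference_citers)  # cite focal and its refs (consolidating)
    n_r = len(reference_citers - focal_citers)  # bypass focal, cite only its refs
    total = n_f + n_b + n_r
    return (n_f - n_b) / total if total else 0.0

# Hypothetical example: three later papers cite only the focal work,
# one cites both it and its references, one cites only the references.
print(cd_index({"p1", "p2", "p3", "p4"}, {"p4", "p5"}))  # (3 - 1) / 5 = 0.4
```

A paper whose disruptive and consolidating citations balance out, as with the accelerating-expansion discovery mentioned above, scores 0 even though it is highly influential.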
Recommendation systems sometimes also use citations to find studies similar to the one the user is currently reading, or studies the user may be interested in and find useful.[10] Better availability of integrable open citation information could help address the "overwhelming amount of scientific literature".[11]
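One simple signal such a recommender can use is bibliographic coupling: papers that share many references are likely related. A minimal sketch, assuming Jaccard similarity over reference sets and entirely hypothetical paper and reference IDs:

```python
def coupling_similarity(refs_a, refs_b):
    """Jaccard similarity of two papers' reference sets
    (bibliographic coupling), a simple citation-based
    relatedness signal. Inputs are sets of cited-paper IDs."""
    union = refs_a | refs_b
    return len(refs_a & refs_b) / len(union) if union else 0.0

# Hypothetical data: rank candidate papers by overlap with the
# references of the paper the user is currently reading.
query_refs = {"r1", "r2", "r3"}
candidates = {"paper_X": {"r2", "r3", "r4"}, "paper_Y": {"r5"}}
ranked = sorted(candidates,
                key=lambda p: coupling_similarity(query_refs, candidates[p]),
                reverse=True)
print(ranked)  # paper_X ranks first: it shares two references with the query
```

Production systems combine many such signals (co-citation, text similarity, usage data), but reference overlap illustrates the citation-based core.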
Knowledge agents may use citations to find studies relevant to a user's query; in particular, citation statements are used by scite.ai to answer questions while providing the associated reference(s).[12]
There has also been analysis of citations of scientific information on Wikipedia, e.g. enabling lists of the most relevant or most-cited scientific journals, categories, and dominant domains. Since 2015, the altmetrics platform Altmetric.com has shown which English Wikipedia articles cite a given study, later adding other language editions.[13][14] The Wikimedia platform under development, Scholia, also shows "Wikipedia mentions" of scientific works.[15] A study suggests a citation on Wikipedia "could be considered a public parallel to scholarly citation".[16] A scientific publication being "cited in a Wikipedia article is considered an indicator of some form of impact for this publication", and it may be possible to detect certain publications through changes to Wikipedia articles.[17] Wikimedia Research's Cite-o-Meter tool showed a league table of which academic publishers are most cited on Wikipedia,[16] as does a page by the Academic Journals WikiProject.[18][19] Research indicates that a large share of academic citations on the platform are paywalled and hence inaccessible to many readers.[20][21]

"[citation needed]" is a tag added by Wikipedia editors to unsourced statements in articles, requesting that citations be added.[22] The phrase reflects Wikipedia's policies of verifiability and no original research, and has become a general Internet meme.[23]
The tool scite.ai tracks citations of papers and classifies each as 'Supporting', 'Mentioning' or 'Contrasting' the cited study. This differentiation of citation contexts may be useful for evaluation and metrics, and, for example, for discovering studies or statements that contrast with statements within a specific study.[24][25]
The Scite Reference Check bot is an extension of scite.ai that scans new article PDFs "for references to retracted papers, and posts both the citing and retracted papers on Twitter" and also "flags when new studies cite older ones that have issued corrections, errata, withdrawals, or expressions of concern".[26] Studies have suggested as few as 4% of citations to retracted papers clearly recognize the retraction.[26] Research found "that authors tend to keep citing retracted papers long after they have been red flagged, although at a lower rate".[27]