The reliability of Wikipedia and its user-generated editing model, particularly its English-language edition, has been questioned and tested. Wikipedia is written and edited by volunteer editors, who generate online content with the editorial oversight of other volunteer editors via community-generated policies and guidelines. The reliability of the project has been tested statistically through comparative review, analysis of historical patterns, and examination of the strengths and weaknesses inherent in its editing process.[1] The online encyclopedia has been criticized for factual unreliability, principally regarding its content, presentation, and editorial processes. Studies and surveys attempting to gauge the reliability of Wikipedia have produced mixed results. Wikipedia's reliability was frequently criticized in the 2000s but has since improved; its English-language edition has been generally praised in the late 2010s and early 2020s.[2] [3] [4]
Select assessments of its reliability have examined how quickly vandalism—content perceived by editors to constitute false or misleading information—is removed. Two years after the project was started, in 2003, an IBM study found that "vandalism is usually repaired extremely quickly—so quickly that most users will never see its effects".[5] The inclusion of false or fabricated content has, at times, lasted for years on Wikipedia due to its volunteer editorship. Its editing model facilitates multiple systemic biases, namely selection bias, inclusion bias, participation bias, and group-think bias. The majority of the encyclopedia is written by male editors, leading to a gender bias in coverage, and the makeup of the editing community has prompted concerns about racial bias, spin bias, corporate bias, and national bias, among others.[6] [7] [8] An ideological bias on Wikipedia has also been identified on both conscious and subconscious levels. A series of studies from Harvard Business School in 2012 and 2014 found Wikipedia "significantly more biased" than Encyclopædia Britannica but attributed the finding more to the greater length of the online encyclopedia's articles than to slanted editing.[9] [10]
Instances of non-neutral or conflict-of-interest editing and the use of Wikipedia for "revenge editing" have attracted attention to false, biased, or defamatory content in articles, especially biographies of living people.[11] [12] Articles on less technical subjects, such as the social sciences, humanities, and culture, have been known to deal with misinformation cycles, cognitive biases, coverage discrepancies, and editor disputes. The online encyclopedia does not guarantee the validity of its information. It is seen as a valuable "starting point" for researchers when they move beyond its content to examine the listed references, citations, and sources. Academics suggest reviewing reliable sources when assessing the quality of articles.[13] [14]
Its coverage of medical and scientific topics such as pathology,[15] toxicology, oncology,[16] pharmaceuticals, and psychiatry[17] was compared to professional and peer-reviewed sources in a 2005 Nature study.[18] A year later Encyclopædia Britannica disputed the Nature study, whose authors, in turn, replied with a further rebuttal.[19] [20] Concerns regarding readability and the overuse of technical language were raised in studies published by the American Society of Clinical Oncology (2011),[21] Psychological Medicine (2012), and the European Journal of Gastroenterology and Hepatology (2014).[22] The Simple English Wikipedia offers simplified versions of articles, written in Basic English, to make complex topics more accessible to the layperson. Wikipedia's popularity, mass readership, and free accessibility have led the encyclopedia to command substantial second-hand cognitive authority across the world.[23] [24]
Wikipedia allows anonymous editing; contributors (known as "editors") are not required to provide any identification or an email address. A 2009 Dartmouth College study of the English Wikipedia noted that, contrary to usual social expectations, anonymous editors were some of Wikipedia's most productive contributors of valid content.[25] The Dartmouth study was criticized by John Timmer of Ars Technica for its methodological shortcomings.[26]
Wikipedia trusts this same community of volunteer editors to self-regulate and to become more proficient at quality control. Wikipedia has harnessed the work of millions of people to produce the world's largest knowledge-based site along with software to support it, resulting in more than nineteen million articles written, across more than 280 different language versions, in fewer than twelve years.[27] For this reason, there has been considerable interest in the project, both academically and from diverse fields such as information technology, business, project management, knowledge acquisition, software programming, other collaborative projects, and sociology. Researchers have explored whether the Wikipedia model can produce quality results, what collaboration of this kind can reveal about people, and whether the scale of involvement can overcome the obstacles of individual limitations and the poor editorship that would otherwise arise.
Wikipedia's degree of truthfulness stems from its technology, policies, and editor culture. Edit histories are publicly visible. Footnotes show the origins of claims. Editors remove unverifiable claims and overrule ("revert") claims not phrased from a neutral point of view (NPOV). Wikipedia editors also tend towards self-examination and acknowledge Wikipedia's flaws. Its open model permits article-tampering (vandalism), including short-lived jokes and longer hoaxes. Some editors dedicate as much time to trolling (creating vandalism, spam, and harassment) as others do to improving the encyclopedia. The English Wikipedia's editor pool, roughly 40,000 active editors who make at least five edits monthly, largely skews male and white, leading to gender- and race-based systemic biases in coverage. Variations in coverage mean that Wikipedia can be, as online communities professor Amy S. Bruckman put it, "the most accurate form of information ever created by humans" on the whole, while its short articles can be "total garbage".
Academics view Wikipedia as representing a "consensus truth" in which readers can check reality in an age of contested facts. For example, when facts surrounding the COVID-19 pandemic rapidly changed or were debated, editors removed claims that did not adhere to the "verifiability" and "NPOV" guidelines.
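The edit histories mentioned above are exposed through the public MediaWiki Action API as well as through the site itself, so this transparency can be examined programmatically. The short Python sketch below is purely illustrative and is not drawn from any of the sources cited in this article; the example article title, revision limit, and User-Agent string are arbitrary placeholder choices.

```python
# Illustrative sketch: fetch the five most recent revisions of an example
# article via the MediaWiki Action API. Each revision records who edited,
# when, and the edit summary, which is what makes reverts and vandalism
# repair publicly auditable.
import requests

API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Printing press",          # example article
    "rvprop": "timestamp|user|comment",
    "rvlimit": 5,                        # five most recent edits
    "format": "json",
    "formatversion": 2,
}
resp = requests.get(API, params=params,
                    headers={"User-Agent": "reliability-demo/0.1 (example)"})
page = resp.json()["query"]["pages"][0]
for rev in page["revisions"]:
    print(rev["timestamp"], rev["user"], rev["comment"])
```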
See main article: Wikipedia and fact-checking.
Fact-checking of Wikipedia is the process through which Wikipedia editors verify content published on Wikipedia, while fact-checking using Wikipedia is the use of Wikipedia to verify other publications. The broader topic of fact-checking in the context of Wikipedia also includes the cultural discussion of Wikipedia's place in fact-checking. Major platforms including YouTube[28] and Facebook[29] use Wikipedia's content to confirm the accuracy of information in their own media collections. Seeking public trust is a major part of Wikipedia's publication philosophy.[30]
Wikipedia has grown beyond a simple encyclopedia to become what The New York Times called a "factual netting that holds the digital world together".[31] Common questions asked of search engines are answered using knowledge ingested from Wikipedia, often with credit or a link to Wikipedia as the source. Wikipedia is likely the most important single source used to train generative artificial intelligence (AI) models, such as ChatGPT, for which Wikipedia is valued as a well-curated data set with highly structured formatting.[32] The accuracy of AI models depends on the quality of their training data, but these models are also fundamentally unable to cite the original sources of their knowledge, so AI users draw on Wikipedia-derived knowledge without knowing that Wikipedia is its source. AI users also receive results that intertwine facts originating from Wikipedia with fictional data points (AI hallucinations), lowering the quality of the information absent a real-time fact-check during information retrieval.
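As a rough illustration of how a downstream consumer can retrieve Wikipedia content with attribution, the sketch below fetches a plain-text article summary from the Wikimedia REST API. It is a minimal example under stated assumptions, not a description of how any particular search engine or AI pipeline actually ingests Wikipedia; the article title and User-Agent string are placeholders.

```python
# Illustrative sketch: retrieve a structured, plain-text summary of an
# example article from the Wikimedia REST API, the kind of well-curated
# snippet that can be credited back to Wikipedia as its source.
import requests

title = "Printing press"  # example topic
url = ("https://en.wikipedia.org/api/rest_v1/page/summary/"
       + title.replace(" ", "_"))
resp = requests.get(url, headers={"User-Agent": "ingest-demo/0.1 (example)"})
data = resp.json()
print(data["title"])
print(data["extract"])  # plain-text summary suitable for reuse with attribution
```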
The reliability of Wikipedia articles can be measured against criteria such as the accuracy and comprehensiveness of their content, their susceptibility to bias and to false information, the stability of articles over time, the quality of the writing, and the use of reputable third-party sources as citations.
Several "market-oriented" extrinsic measures demonstrate that large audiences trust Wikipedia in one way or another. For instance, "50 percent of [US] physicians report that they've consulted ... [Wikipedia] for information on health conditions", according to a report from IMS Institute for Healthcare Informatics.[34]
On October 24, 2005, the British newspaper The Guardian published a story entitled "Can you trust Wikipedia?" in which a panel of experts was asked to review seven entries related to their fields, giving each article reviewed a score from 0 to 10. Most of the reviewed articles received marks between 5 and 8. The most common critiques were: poor prose or ease-of-reading issues (three mentions); omissions or inaccuracies, often small but including key omissions in some articles (three mentions); and poor balance, with less important areas given more attention and vice versa (one mention). The most common praises were: factual soundness and correctness, with no glaring inaccuracies (four mentions); and much useful information, including well-selected links, making it possible to "access much information quickly" (three mentions).
In December 2005, the journal Nature published the results of an attempted blind study seeking reviewer evaluations of the accuracy of a small subset of articles from Wikipedia and Encyclopædia Britannica. The non-peer-reviewed study was based on Nature's selection of 42 articles on scientific topics, including biographies of well-known scientists. The number of factual errors, omissions or misleading statements found in the sampled articles was 162 for Wikipedia and 123 for Britannica (a ratio of 4:3). Four serious errors, such as misinterpretations of important concepts, were found in Wikipedia and four in Britannica (1:1). The study concluded that "Wikipedia comes close to Britannica in terms of the accuracy of its science entries",[18] although Wikipedia's articles were often "poorly structured".[18]
Encyclopædia Britannica expressed concerns, leading Nature to release further documentation of its survey method.[35] Based on this additional information, Encyclopædia Britannica denied the validity of the Nature study, stating that it was "fatally flawed". Among Britannica's criticisms were that excerpts rather than the full texts of some of its articles were used, that some of the extracts were compilations that included articles written for the youth version, that Nature did not check the factual assertions of its reviewers, and that many points the reviewers labeled as errors were differences of editorial opinion. Britannica further stated that "While the heading proclaimed that 'Wikipedia comes close to Britannica in terms of the accuracy of its science entries,' the numbers buried deep in the body of the article said precisely the opposite: Wikipedia in fact had a third more inaccuracies than Britannica. (As we demonstrate below, Nature's research grossly exaggerated Britannica's inaccuracies, so we cite this figure only to point out the slanted way in which the numbers were presented.)"[36] Nature acknowledged the compiled nature of some of the Britannica extracts, but denied that this invalidated the conclusions of the study.[37] Encyclopædia Britannica also argued that a breakdown of the errors indicated that the mistakes in Wikipedia were more often the inclusion of incorrect facts, while the mistakes in Britannica were "errors of omission", making "Britannica far more accurate than Wikipedia, according to the figures".[36] Nature has since rejected the Britannica response,[38] stating that any errors on the part of its reviewers were not biased in favor of either encyclopedia, that in some cases it used excerpts of articles from both encyclopedias, and that Britannica did not share particular concerns with Nature before publishing its "open letter" rebuttal.[39] [40]
The point-for-point disagreement between these two parties addressed the compilation/text-excerpting and very small sample size issues, which were argued to bias the outcome in favor of Wikipedia, as opposed to a comprehensive, full-article, large-sample-size study that would favor the quality-controlled format of Britannica. The dispute has been echoed in online discussions,[41] [42] including discussions of articles citing the Nature study, where a "flawed study design" arising from manual selection of articles or article portions, the lack of statistical power in comparing 40 articles drawn from over 100,000 Britannica articles and over 1 million English Wikipedia articles, and the absence of any statistical analysis in the study (e.g., reported confidence intervals for its results) have also been noted.[43]
In June 2006, Roy Rosenzweig, a professor specializing in American history, published a comparison of the Wikipedia biographies of 25 Americans to the corresponding biographies found on Encarta and American National Biography Online. He wrote that Wikipedia is "surprisingly accurate in reporting names, dates, and events in U.S. history" and described some of the errors as "widely held but inaccurate beliefs". However, he stated that Wikipedia often fails to distinguish important from trivial details, and does not provide the best references. He also complained about Wikipedia's lack of "persuasive analysis and interpretations, and clear and engaging prose".[44]
A web-based survey conducted from December 2005 to May 2006 by Larry Press, a professor of Information Systems at California State University, Dominguez Hills, assessed the "accuracy and completeness of Wikipedia articles".[45] Fifty people accepted an invitation to assess an article. Of these, 76% agreed or strongly agreed that the Wikipedia article was accurate, and 46% agreed or strongly agreed that it was complete. Eighteen people compared the article they reviewed to the article on the same topic in the Encyclopædia Britannica. Opinions on accuracy were almost equal between the two encyclopedias (6 favoring Britannica, 7 favoring Wikipedia, 5 rating them equal), and eleven of the eighteen (61%) found Wikipedia somewhat or substantially more complete, compared to seven of the eighteen (39%) for Britannica. The survey did not attempt a random selection of participants, and it is not clear how the participants were invited.[46]
The German computing magazine c't performed a comparison of Brockhaus Multimedial, Microsoft Encarta, and the German Wikipedia in October 2004: Experts evaluated 66 articles in various fields. In overall score, Wikipedia was rated 3.6 out of 5 points (B-).[47] A second test by c't in February 2007 used 150 search terms, of which 56 were closely evaluated, to compare four digital encyclopedias: Bertelsmann Enzyklopädie 2007, Brockhaus Multimedial premium 2007, Encarta 2007 Enzyklopädie and Wikipedia. It concluded: "We did not find more errors in the texts of the free encyclopedia than in those of its commercial competitors."[48]
Viewing Wikipedia as fitting the economists' definition of a perfectly competitive marketplace of ideas, George Bragues of the University of Guelph-Humber examined Wikipedia's articles on seven top Western philosophers: Aristotle, Plato, Immanuel Kant, René Descartes, Georg Wilhelm Friedrich Hegel, Thomas Aquinas, and John Locke. Wikipedia's articles were compared to a consensus list of themes culled from four reference works in philosophy. Bragues found that, on average, Wikipedia's articles covered only 52% of the consensus themes. No errors were found, though there were significant omissions.[49]
PC Pro magazine (August 2007) asked experts to compare four articles (a small sample) in their scientific fields between Wikipedia, Britannica and Encarta. In each case Wikipedia was described as "largely sound", "well handled", "performs well", "good for the bare facts" and "broadly accurate". One article had "a marked deterioration towards the end" while another had "clearer and more elegant" writing, a third was assessed as less well written but better detailed than its competitors, and a fourth was "of more benefit to the serious student than its Encarta or Britannica equivalents". No serious errors were noted in Wikipedia articles, whereas serious errors were noted in one Encarta and one Britannica article.[50]
In October 2007, the Australian magazine PC Authority published a feature article on the accuracy of Wikipedia. The article compared Wikipedia's content to other popular online encyclopedias, namely Britannica and Encarta. The magazine asked experts to evaluate articles pertaining to their field. A total of four articles were reviewed by three experts. Wikipedia was comparable to the other encyclopedias, topping the chemistry category.[51]
In December 2007, German magazine Stern published the results of a comparison between the German Wikipedia and the online version of the 15-volume edition of Brockhaus Enzyklopädie. The test was commissioned to a research institute (Cologne-based WIND GmbH), whose analysts assessed 50 articles from each encyclopedia (covering politics, business, sports, science, culture, entertainment, geography, medicine, history and religion) on four criteria (accuracy, completeness, timeliness and clarity), and judged Wikipedia articles to be more accurate on the average (1.6 on a scale from 1 to 6 versus 2.3 for Brockhaus, with 1 as the best and 6 as the worst). Wikipedia's coverage was also found to be more complete and up to date; however, Brockhaus was judged to be more clearly written, while several Wikipedia articles were criticized as being too complicated for non-experts, and many as too lengthy.[52] [53] [54]
In its April 2008 issue British computing magazine PC Plus compared the English Wikipedia with the DVD editions of World Book Encyclopedia and Encyclopædia Britannica, assessing for each the coverage of a series of random subjects. It concluded, "The quality of content is good in all three cases" and advised Wikipedia users "Be aware that erroneous edits do occur, and check anything that seems outlandish with a second source. But the vast majority of Wikipedia is filled with valuable and accurate information."[55]
A 2008 paper in Reference Services Review compared nine Wikipedia entries on historical topics to their counterparts in Encyclopædia Britannica, The Dictionary of American History and American National Biography Online. The paper found that Wikipedia's entries had an overall accuracy rate of 80 percent, whereas the other encyclopedias had an accuracy rate of 95 to 96 percent.[56]
A 2010 study assessed the extent to which Wikipedia pages about the history of countries conformed to the site's verifiability policy. It found that, in contradiction of this policy, many claims in these articles were not supported by citations, and that many of those that were cited popular media and government websites rather than academic journal articles.[57]
In April 2011, a study was published by Adam Brown of Brigham Young University in the journal PS: Political Science & Politics which examined "thousands of Wikipedia articles about candidates, elections, and officeholders". The study found that while the information in these articles tended to be accurate, the articles examined contained many errors of omission.[58]
A 2012 study co-authored by Shane Greenstein examined a decade of Wikipedia articles on United States politics and found that the more contributors there were to a given article, the more neutral it tended to be, in line with a narrow interpretation of Linus's law.[59]
Reavley et al. (2012) compared the quality of articles on select mental health topics on Wikipedia with corresponding articles in Encyclopædia Britannica and a psychiatry textbook. They asked experts to rate article content with regard to accuracy, up-to-dateness, breadth of coverage, referencing and readability. Wikipedia scored highest on all criteria except readability, and the authors concluded that Wikipedia is as good as or better than Britannica and a standard textbook.
A 2014 perspective piece in the New England Journal of Medicine examined Wikipedia pages about 22 prescription drugs to determine if they had been updated to include the most recent FDA safety warnings. It found that 41% of these pages were updated within two weeks of the warning, 23% were updated more than two weeks later, and the remaining 36% had still not been updated as of January 2014, more than a year after the warning.[60]
A 2014 study in the Journal of the American Pharmacists Association examined 19 Wikipedia articles about herbal supplements, and concluded that all of these articles contained information about their "therapeutic uses and adverse effects", but also concluded that "several lacked information on drug interactions, pregnancy, and contraindications". The study's authors therefore recommended that patients not rely solely on Wikipedia as a source for information about the herbal supplements in question.[61]
Another study published in 2014 in PLOS ONE found that Wikipedia's information about pharmacology was 99.7% accurate when compared to a pharmacology textbook, and that the completeness of such information on Wikipedia was 83.8%. The study also determined that completeness of these Wikipedia articles was lowest (68%) in the category "pharmacokinetics" and highest (91.3%) in the category "indication". The authors concluded that "Wikipedia is an accurate and comprehensive source of drug-related information for undergraduate medical education".[62]
In a 2004 interview with The Guardian, self-described information specialist and Internet consultant[63] Philip Bradley said that he would not use Wikipedia and was "not aware of a single librarian who would". He then explained that "the main problem is the lack of authority. With printed publications, the publishers have to ensure that their data are reliable, as their livelihood depends on it. But with something like this, all that goes out the window."[64]
In 2005, the library at Trent University in Ontario stated Wikipedia had many articles that are "long and comprehensive", but that there is "a lot of room for misinformation and bias [and] a lot of variability in both the quality and depth of articles". It adds that Wikipedia has advantages and limitations, that it has "excellent coverage of technical topics" and articles are "often added quickly and, as a result, coverage of current events is quite good", comparing this to traditional sources which are unable to achieve this task. It concludes that, depending upon the need, one should think critically and assess the appropriateness of one's sources, "whether you are looking for fact or opinion, how in-depth you want to be as you explore a topic, the importance of reliability and accuracy, and the importance of timely or recent information", and adds that Wikipedia can be used in any event as a "starting point".[65]
A 2006 review of Wikipedia by Library Journal, using a panel of librarians, "the toughest critics of reference materials, whatever their format", asked "long standing reviewers" to evaluate three areas of Wikipedia (popular culture, current affairs, and science), and concluded: "While there are still reasons to proceed with caution when using a resource that takes pride in limited professional management, many encouraging signs suggest that (at least for now) Wikipedia may be granted the librarian's seal of approval". A reviewer who "decided to explore controversial historical and current events, hoping to find glaring abuses" said, "I was pleased by Wikipedia's objective presentation of controversial subjects" but that "as with much information floating around in cyberspace, a healthy degree of skepticism and skill at winnowing fact from opinion are required". Other reviewers noted that there is "much variation" but "good content abounds".[66]
In 2007, Michael Gorman, former president of the American Library Association (ALA) stated in an Encyclopædia Britannica blog that "A professor who encourages the use of Wikipedia is the intellectual equivalent of a dietician who recommends a steady diet of Big Macs with everything".[67]
Information Today (March 2006) cites librarian Nancy O'Neill (principal librarian for Reference Services at the Santa Monica Public Library System) as saying that "there is a good deal of skepticism about Wikipedia in the library community" but that "she also admits cheerfully that Wikipedia makes a good starting place for a search. You get terminology, names, and a feel for the subject."[68]
PC Pro (August 2007) cites the head of the European and American Collection at the British Library, Stephen Bury, as stating "Wikipedia is potentially a good thing—it provides a speedier response to new events, and to new evidence on old items". The article concludes: "For [Bury], the problem isn't so much the reliability of Wikipedia's content so much as the way in which it's used." "It's already become the first port of call for the researcher", Bury says, before noting that this is "not necessarily problematic except when they go no further". According to Bury, the trick to using Wikipedia is to understand that "just because it's in an encyclopedia (free, web or printed) doesn't mean it's true. Ask for evidence ... and contribute."[50]
A 2006 article for the Canadian Library Association (CLA)[69] discussed the Wikipedia approach, process and outcome in depth, commenting for example that in controversial topics, "what is most remarkable is that the two sides actually engaged each other and negotiated a version of the article that both can more or less live with".
Shi et al. extended this analysis in discussing "The wisdom of polarized crowds" in 2017 based on content analysis of all edits to English Wikipedia articles relating to politics, social issues and science from its start to December 1, 2016. This included almost 233,000 articles representing approximately 5 percent of the English Wikipedia. They wrote: "Political speech [at least in the United States] has become markedly more polarized in recent years ... . [D]espite early promise of the world-wide-web to democratize access to diverse information, increased media choice and social networking platforms ... [create] echo chambers that ... degrade the quality of individual decisions, ... discount identity-incongruent opinions, stimulate and reinforce polarizing information ... foment conflict and even make communication counter-productive. Nevertheless, a large literature documents the largely positive effect that social differences can exert on the collaborative production of information, goods and services. Research demonstrates that individuals from socially distinct groups embody diverse cognitive resources and perspectives that, when cooperatively combined ... outperform those from homogeneous groups." They translated edit histories of millions of Wikipedia editors into a 7-point political identification scale and compared that with Wikipedia's six-level article quality score (stub, start, C, B, good, featured) assigned via a machine learning algorithm. They found that "articles attracting more attention tend to have more balanced engagement ... [and] higher polarization is associated with higher quality."[70]
Academics have also criticized Wikipedia for its perceived failure as a reliable source and because Wikipedia editors may have no expertise, competence, or credentials in the topics on which they contribute.[71] [72] Adrian Riskin, a mathematician at Whittier College, commented that while highly technical articles may be written by mathematicians for mathematicians, the more general mathematics topics, such as the article on polynomials, are written in a very amateurish fashion with a number of obvious mistakes.[73]
Because Wikipedia cannot be considered a reliable source, its use is not accepted in many schools and universities when writing a formal paper, and some educational institutions have banned it as a primary source while others have limited its use to a pointer to external sources.[74] [75] The criticism of not being a reliable source may not apply only to Wikipedia but to encyclopedias in general—some university lecturers are not impressed when students cite print-based encyclopedias in assigned work.[76] However, instructors may have underestimated the use of Wikipedia in academia because of these concerns. Researchers and academics contend that while Wikipedia may not be used as a fully accurate source for final papers, it is a valuable jumping-off point for research that can lead to many possibilities if approached critically. What may be missing in academia is an emphasis on critical analysis with regard to the use of Wikipedia in secondary and higher education. On this view, Wikipedia should not be dismissed entirely (it contains fewer inaccuracies than errors of omission) but should instead be supported and taught as an educational tool, in tandem with critical thinking skills that allow students to filter the information found on the online encyclopedia and help them critically analyze their findings.[77]
An empirical study conducted in 2006 by a Business School lecturer in Information Systems at the University of Nottingham,[78] and the subject of a review on the technical website Ars Technica,[79] asked 55 academics to review specific Wikipedia articles that either were in their field of expertise (group 1) or were chosen at random (group 2). It concluded: "The experts found Wikipedia's articles to be more credible than the non-experts. This suggests that the accuracy of Wikipedia is high. However, the results should not be seen as support for Wikipedia as a totally reliable resource as, according to the experts, 13 percent of the articles contain mistakes (10% of the experts reported factual errors of an unspecified degree, 3% of them reported spelling errors)."[80]
The Gould Library at Carleton College in Minnesota has a web page describing the use of Wikipedia in academia. It asserts that "Wikipedia is without question a valuable and informative resource", but that "there is an inherent lack of reliability and stability" to its articles, again drawing attention to similar advantages and limitations as other sources. As with other reviews, it comments that one should assess one's sources and what is desired from them, and that "Wikipedia may be an appropriate resource for some assignments, but not for others." It cited Wikipedia co-founder Jimmy Wales' view that Wikipedia may not be ideal as a source for all academic uses, and (as with other sources) suggests that at the least, one strength of Wikipedia is that it provides a good starting point for current information on a very wide range of topics.[81]
In 2007, the Chronicle of Higher Education published an article written by Cathy Davidson, Professor of Interdisciplinary Studies and English at Duke University, in which she asserts that Wikipedia should be used to teach students about the concepts of reliability and credibility.[82]
In 2008, Hamlet Isakhanli, founder and president of Khazar University, compared the Encyclopædia Britannica and English Wikipedia articles on Azerbaijan and related subjects. His study found that Wikipedia covered the subject much more widely, more accurately and in more detail, though with some lack of balance, and that Wikipedia was the best source for a first approximation.[83]
In 2011, Karl Kehm, associate professor of physics at Washington College, said: "I do encourage [my students] to use [Wikipedia] as one of many launch points for pursuing original source material. The best Wikipedia entries are well researched with extensive citations".[84]
Some academic journals do refer to Wikipedia articles, but do not elevate it to the same level as traditional references. For instance, Wikipedia articles have been referenced in "enhanced perspectives" provided online in the journal Science. The first of these perspectives to provide a hyperlink to Wikipedia was "A White Collar Protein Senses Blue Light" in 2002,[85] and dozens of enhanced perspectives have provided such links since then. The publisher of Science states that these enhanced perspectives "include hypernotes—which link directly to websites of other relevant information available online—beyond the standard bibliographic references".[86]
Sverrir Steinsson investigated factors that influenced the credibility of English Wikipedia in 2023, and found that "Wikipedia transformed from a dubious source of information in its early years to an increasingly reliable one over time."[87] This was due to it becoming "an active fact-checker and anti-fringe", with "pro-fringe editors" leaving the site as the Wikipedia community changed its interpretation of the NPOV policy and began to more accurately label misleading content as pseudoscience, conspiracy theory, etc., in harmony with the citations used to source that content.[88] This reinterpretation of NPOV "had meaningful consequences, turning an organization that used to lend credence and false balance to pseudoscience, conspiracy theories, and extremism into a proactive debunker, fact-checker and identifier of fringe discourse."
In his 2014 book Virtual Unreality, Charles Seife, a professor of journalism at New York University, noted Wikipedia's susceptibility to hoaxes and misinformation, including manipulation by commercial and political organizations "masquerading as common people" making edits to Wikipedia. In conclusion, Seife presented advice on how readers should treat information found on Wikipedia.
Seife observed that when false information from Wikipedia spreads to other publications, it sometimes alters truth itself. On June 28, 2012, for example, an anonymous Wikipedia contributor added the invented nickname "Millville Meteor" to the Wikipedia biography of baseball player Mike Trout. A couple of weeks later, a Newsday sports writer reproduced the nickname in an article, and "with that act, the fake nickname became real". Seife pointed out that while Wikipedia, by some standards, could be described as "roughly as accurate" as traditional publications, and is more up to date, "there's a difference between the kind of error one would find in Wikipedia and what one would [find] in Britannica or Collier's or even in the now-defunct Microsoft Encarta encyclopedia ... the majority of hoaxes on Wikipedia could never have appeared in the old-fashioned encyclopedias." Dwight Garner, reviewing Seife's book in The New York Times, said that he himself had "been burned enough times by bad online information", including "Wikipedia howlers", to have adopted a very sceptical mindset.[89]
In November 2012, judge Brian Leveson was accused of having forgotten "one of the elementary rules of journalism" when he named a "Brett Straub" as one of the founders of The Independent newspaper in his report on the culture, practices and ethics of the British press. The name had been added to the Wikipedia article on The Independent over a year prior, and turned out to be that of a 25-year-old Californian, whose friend had added his name to a string of Wikipedia pages as a prank.[90] Straub was tracked down by The Telegraph and commented, "The fact someone, especially a judge, has believed something on Wikipedia is kind of shocking. My friend went on and edited a bunch of Wikipedia pages and put my name there. [...] I knew my friend had done it but I didn't know how to change them back and I thought someone would. At one point I was the creator of Coca-Cola or something. You know how easy it is to change Wikipedia. Every time he came across a red linked name he put my name in its place."[91]
A 2016 BBC article by Ciaran McCauley similarly noted that "plenty of mischievous, made-up information has found its way" on to Wikipedia and that "many of these fake facts have fallen through the cracks and been taken as gospel by everyone from university academics to major newspapers and broadcasters." Listing examples of journalists being embarrassed by reproducing hoaxes and other falsifications from Wikipedia in their writing, including false information propagated by major news organizations in their obituaries of Maurice Jarre and Ronnie Hazlehurst, McCauley stated: "Any journalist in any newsroom will likely get a sharp slap across the head from an editor for treating Wikipedia with anything but total skepticism (you can imagine the kicking I've taken over this article)."[92]
The Daily Mail—itself banned as a source on Wikipedia in 2017 because of its perceived unreliability—has publicly stated that it "banned all its journalists from using Wikipedia as a sole source in 2014 because of its unreliability".[93]
Slate said in 2022 that "Screenshots of vandalized Wikipedia articles, even when reverted within minutes, often have a much longer afterlife in news reports and on social media, creating the public impression that the platform is more vulnerable to abuse than it actually is."[94]
See main article: Health information on Wikipedia and Science information on Wikipedia. Science and medicine are areas where accuracy is of high importance and peer review is the norm. While some of Wikipedia's content has passed a form of peer review, most has not.[95]
A 2008 study examined 80 Wikipedia drug entries. The researchers found few factual errors in this set of articles, but determined that these articles were often missing important information, such as contraindications and drug interactions. One of the researchers noted that "If people went and used this as a sole or authoritative source without contacting a health professional...those are the types of negative impacts that can occur." The researchers also compared Wikipedia to Medscape Drug Reference (MDR) by looking for answers to 80 different questions covering eight categories of drug information, including adverse drug events, dosages, and mechanism of action. They determined that MDR provided answers to 82.5 percent of the questions, while Wikipedia could answer only 40 percent, and that Wikipedia's answers were also less likely to be complete. None of the answers from Wikipedia were determined to be factually inaccurate, while four inaccurate answers were found in MDR. But the researchers found 48 errors of omission in the Wikipedia entries, compared to 14 for MDR. The lead investigator concluded: "I think that these errors of omission can be just as dangerous [as inaccuracies]", and he pointed out that drug company representatives have been caught deleting information from Wikipedia entries that makes their drugs look unsafe.[96]
A 2009 survey asked US toxicologists how accurately they rated the portrayal of health risks of chemicals in different media sources. It was based on the answers of 937 members of the Society of Toxicology and found that these experts regarded Wikipedia's reliability in this area as far higher than that of all traditional news media.
In 2010 researchers compared information about 10 types of cancer on Wikipedia to similar data from the National Cancer Institute's Physician Data Query and concluded "the Wiki resource had similar accuracy and depth to the professionally edited database" and that "sub-analysis comparing common to uncommon cancers demonstrated no difference between the two", but that ease of readability was an issue.[97]
A 2011 study found that the categories most frequently absent from Wikipedia's drug articles were those of drug interactions and medication use in breastfeeding.[98] Other categories with incomplete coverage were descriptions of off-label indications, contraindications and precautions, adverse drug events, and dosing.[98] The information most frequently deviating from the other sources used in the study concerned contraindications and precautions, drug absorption, and adverse drug events.[98]
A 2012 study reported that Wikipedia articles about pediatric otolaryngology contained twice as many errors and omissions as the medical database eMedicine.[99]
In a U.S. study in 2014, 10 researchers examined the 10 Wikipedia health articles on the most costly medical conditions in the United States and found that 90% of the entries contained errors and statements that contradicted the latest medical research. However, according to Stevie Benton of Wikimedia UK, the sample size used in the research may have been too small to be considered representative.[100] [101] Only part of the data was made public, and for two statements that were released for other researchers to examine, the claim that Wikipedia's statements contradicted the peer-reviewed literature was called into question.
A 2014 study published in PLOS One looked at the quality of Wikipedia articles on pharmacology, comparing articles from English and German Wikipedia with academic textbooks. It found that "the collaborative and participatory design of Wikipedia does generate high quality information on pharmacology that is suitable for undergraduate medical education".[102]
References to Wikipedia in United States judicial opinions have increased each year since 2004. In a 2017 ruling, the Supreme Court of Texas advised against reliance on the information in Wikipedia for judicial rulings, arguing that its lack of reliability prevents using it as a source of authority in legal opinions.[103] [104]
The Supreme Court of India in its judgment in Commr. of Customs, Bangalore vs. ACER India Pvt. (Citation 2007(12)SCALE581) held that "We have referred to Wikipedia, as the learned Counsel for the parties relied thereupon. It is an online encyclopaedia and information can be entered therein by any person and as such it may not be authentic."[105]
In a 2004 piece called "The Faith-Based Encyclopedia", Robert McHenry, a former editor-in-chief of Encyclopædia Britannica, stated that Wikipedia errs in billing itself as an encyclopedia, because that word implies a level of authority and accountability that he believes cannot be possessed by an openly editable reference. McHenry argued that "the typical user doesn't know how conventional encyclopedias achieve reliability, only that they do". Similarly, Britannica's executive editor, Ted Pappas, was quoted in The Guardian voicing similar criticism.
In the September 12, 2006, edition of The Wall Street Journal, Jimmy Wales debated Dale Hoiberg, editor-in-chief of Encyclopædia Britannica. Hoiberg focused on the need for expertise and control in an encyclopedia and cited Lewis Mumford's warning that overwhelming information could "bring about a state of intellectual enervation and depletion hardly to be distinguished from massive ignorance". Wales emphasized Wikipedia's differences and asserted that openness and transparency lead to quality. Hoiberg replied that he "had neither the time nor space to respond to [criticisms]" and "could corral any number of links to articles alleging errors in Wikipedia", to which Wales responded: "No problem! Wikipedia to the rescue with a fine article", and included a link to the Wikipedia article Criticism of Wikipedia.[106]
While experienced editors can view an article's history and discussion page, it is not so easy for ordinary readers to check whether information from Wikipedia is reliable. University projects in California, Switzerland and Germany have tried to improve on this using methods of formal analysis and data mining. Wiki-Watch from Germany, which was inspired by WikiBu from Switzerland, shows a rating of up to five stars for every English or German Wikipedia article. Part of this rating is the tool WikiTrust, which indicates the trustworthiness of individual text passages in Wikipedia articles with white (trustworthy) or orange (not trustworthy) markings.[107]
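The WikiTrust markings rest in part on how long a piece of text has survived successive revisions (alongside the reputation of its authors). The toy sketch below illustrates only this persistence heuristic under simplified assumptions; it is not the actual WikiTrust algorithm, and the revision texts are invented examples.

```python
# Toy persistence heuristic: words that have survived more successive
# revisions are treated as more "established" than recently added words.
def survival_counts(revisions):
    """revisions: list of revision texts, oldest first."""
    latest_words = set(revisions[-1].split())
    counts = {}
    for word in latest_words:
        streak = 0
        # Count how many of the most recent revisions also contain the word.
        for text in reversed(revisions):
            if word in text.split():
                streak += 1
            else:
                break
        counts[word] = streak
    return counts

revisions = [
    "the printing press was invented around 1440",
    "the printing press was invented by Gutenberg around 1440",
    "the printing press was invented by Gutenberg around 1440 in Mainz",
]
for word, streak in sorted(survival_counts(revisions).items()):
    label = "established" if streak == len(revisions) else "recent"
    print(f"{word!r}: survived {streak} revision(s) -> {label}")
```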
See main article: Circular reporting. Sources accepted as reliable for Wikipedia may rely on Wikipedia as a reference source, sometimes indirectly. If the original information in Wikipedia was false, once it has been reported in sources considered reliable, Wikipedia can use them to reference the false information, giving an apparent credibility to falsehood. This in turn increases the likelihood of the false information being reported in other media.[108] A known example is the Sacha Baron Cohen article, where false information added in Wikipedia was apparently used by two newspapers, leading to it being treated as reliable in Wikipedia.[109] This process of creating reliable sources for false facts has been termed "citogenesis" by xkcd webcomic artist Randall Munroe.[110] [111] [112]
Somewhat related to the "information loop" is the propagation of misinformation to other websites (Answers.com is just one of many) which will often quote misinformation from Wikipedia verbatim, and without mentioning that it has come from Wikipedia. A piece of misinformation originally taken from a Wikipedia article will live on in perhaps dozens of other websites, even if Wikipedia itself has deleted the unreliable material.[113]
In one article, Information Today (March 2006) likens comparisons between Wikipedia and Britannica to "apples and oranges".
Andrew Orlowski, a columnist for The Register, expressed similar criticisms in 2005, writing that the use of the term "encyclopedia" to describe Wikipedia may lead users into believing it is more reliable than it may be.[114]
BBC technology specialist Bill Thompson wrote that "Most Wikipedia entries are written and submitted in good faith, and we should not let the contentious areas such as politics, religion or biography shape our view of the project as a whole", and that it forms a good starting point for serious research.[115] Thompson adds the observation that since most popular online sources are inherently unreliable in this way, one byproduct of the information age is a wiser audience who are learning to check information rather than take it on faith due to its source, leading to "a better sense of how to evaluate information sources".[115]
In his 2007 Guide to Military History on the Internet, Simon Fowler rated Wikipedia as "the best general resource" for military history research, and stated that "the results are largely accurate and generally free of bias".[116] When rating Wikipedia as the No. 1 military site he mentioned that "Wikipedia is often criticised for its inaccuracy and bias, but in my experience the military history articles are spot on."[117]
In July 2008, The Economist magazine described Wikipedia as "a user-generated reference service" and noted that Wikipedia's "elaborate moderation rules put a limit to acrimony" generated by cyber-nationalism.[118]
Jimmy Wales, a co-founder of Wikipedia, stresses that encyclopedias of any type are not usually appropriate as primary sources, and should not be relied upon as being authoritative.[119]
Carnegie Mellon professor Randy Pausch offered the following anecdote in his book The Last Lecture. He was surprised that the entry he contributed to the World Book Encyclopedia on virtual reality was accepted without question, so he concluded, "I now believe Wikipedia is a perfectly fine source for your information, because I know what the quality control is for real encyclopedias."[120]
Fernanda Viégas of the MIT Media Lab and Martin Wattenberg and Kushal Dave of IBM Research studied the flow of editing in the Wikipedia model, with emphasis on breaks in flow (from vandalism or substantial rewrites), showing the dynamic flow of material over time.[121] From a sample of vandalism edits on the English Wikipedia during May 2003, they found that most such acts were repaired within minutes. They also stated that "it is essentially impossible to find a crisp definition of vandalism".[121]
Lih (2004) compared articles before and after they were mentioned in the press, and found that externally referenced articles were of higher quality. An informal assessment by the popular IT magazine PC Pro for its 2007 article "Wikipedia Uncovered" tested Wikipedia by introducing 10 errors that "varied between bleeding obvious and deftly subtle" into articles (the researchers later corrected the articles they had edited). Labeling the results "impressive", it noted that all but one error was spotted and fixed within the hour, and that "the Wikipedians' tools and know-how were just too much for our team." A second series of another 10 tests, using "far more subtle errors" and additional techniques to conceal their nature, met with similar results: "despite our stealth attempts the vast majority... were discovered remarkably quickly... the ridiculously minor Jesse James error was corrected within a minute and a very slight change to Queen Anne's entry was put right within two minutes". Two of the latter series were not detected. The article concluded that "Wikipedia corrects the vast majority of errors within minutes, but if they're not spotted within the first day the chances... dwindle as you're then relying on someone to spot the errors while reading the article rather than reviewing the edits".
A study in late 2007 systematically inserted inaccuracies into Wikipedia entries about the lives of philosophers. Depending on how exactly the data are interpreted, either one third or one half of the inaccuracies were corrected within 48 hours.[122]
A 2007 peer-reviewed study[123] that measured the actual number of page views with damaged content stated: "42% of damage is repaired almost immediately, i.e., before it can confuse, offend, or mislead anyone. Nonetheless, there are still hundreds of millions of damaged views."[123]
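Repair-time figures like those reported above are derived from public revision metadata. The sketch below shows the underlying arithmetic only, assuming vandalism edits and the reverts that repaired them have already been paired up; real studies use far more elaborate damage-detection methods, and the timestamps here are invented toy data.

```python
# Minimal sketch: compute the median time (in minutes) between a damaging
# edit and the revert that repaired it, given pre-identified pairs.
from datetime import datetime, timezone
from statistics import median

# (vandalism_timestamp, revert_timestamp) pairs in ISO 8601 -- toy data.
pairs = [
    ("2003-05-01T12:00:00Z", "2003-05-01T12:03:00Z"),
    ("2003-05-02T09:30:00Z", "2003-05-02T09:31:00Z"),
    ("2003-05-03T22:15:00Z", "2003-05-04T07:00:00Z"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

latencies = [(parse(revert) - parse(damage)).total_seconds() / 60
             for damage, revert in pairs]
print(f"median time to repair: {median(latencies):.1f} minutes")
```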
Loc Vu-Quoc, professor of Mechanical and Aerospace Engineering at the University of Florida, stated in 2008 that "sometimes errors may go for years without being corrected as experts don't usually read Wikipedia articles in their own field to correct these errors".[124]
See main article: WikiScanner. In August 2007, WikiScanner, a tool developed by Virgil Griffith of the California Institute of Technology, was released to match anonymous IP edits in the encyclopedia with an extensive database of addresses. News stories appeared about IP addresses from various organizations such as the Central Intelligence Agency, the Democratic Congressional Campaign Committee, Diebold, Inc. and the Australian government being used to make edits to Wikipedia articles, sometimes of an opinionated or questionable nature.[125] The BBC quoted a Wikimedia spokesperson as praising the tool: "We really value transparency and the scanner really takes this to another level. Wikipedia Scanner may prevent an organization or individuals from editing articles that they're really not supposed to."[126]
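WikiScanner's core matching step can be illustrated with a short sketch: an anonymous edit's IP address is looked up against address ranges registered to organizations. This is an illustration of the general idea rather than the tool's actual implementation; the ranges below are reserved documentation addresses used purely as placeholders, and the organization names are invented.

```python
# Illustrative sketch: attribute an anonymous edit's IP address to an
# organization by checking it against known address ranges.
import ipaddress

org_ranges = {
    "Example Corp": ipaddress.ip_network("198.51.100.0/24"),    # placeholder range
    "Example Agency": ipaddress.ip_network("203.0.113.0/24"),   # placeholder range
}

def attribute_edit(ip_string):
    ip = ipaddress.ip_address(ip_string)
    for org, network in org_ranges.items():
        if ip in network:
            return org
    return None

print(attribute_edit("203.0.113.42"))   # -> "Example Agency"
print(attribute_edit("192.0.2.7"))      # -> None (no match found)
```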
The WikiScanner story was also covered by The Independent, which stated that many "censorial interventions" by editors with vested interests in a variety of articles on Wikipedia had been discovered. Not everyone hailed WikiScanner as a success for Wikipedia. Oliver Kamm, in a column for The Times, argued instead that:
The WikiScanner is thus an important development in bringing down a pernicious influence on our intellectual life. Critics of the web decry the medium as the cult of the amateur. Wikipedia is worse than that; it is the province of the covert lobby. The most constructive course is to stand on the sidelines and jeer at its pretensions.[127]
WikiScanner only reveals conflicts of interest when the editor does not have a Wikipedia account and their IP address is recorded instead. Conflict-of-interest editing done by editors with accounts is not detected, since the IP addresses behind those edits are hidden from everyone except "a handful of privileged Wikipedia admins".[128]
Wikipedia has been accused of systemic bias, which is to say its general nature leads, without necessarily any conscious intention, to the propagation of various prejudices. Although many articles in newspapers have concentrated on minor, indeed trivial, factual errors in Wikipedia articles, there are also concerns about large-scale, presumably unintentional effects from the increasing influence and use of Wikipedia as a research tool at all levels. In an article in the Times Higher Education magazine (London), philosopher Martin Cohen describes Wikipedia as having "become a monopoly" with "all the prejudices and ignorance of its creators", which he characterizes as a "youthful cab-drivers" perspective.[129] Cohen nonetheless draws a grave conclusion from these circumstances: "To control the reference sources that people use is to control the way people comprehend the world. Wikipedia may have a benign, even trivial face, but underneath may lie a more sinister and subtle threat to freedom of thought."[129] That freedom is undermined, he argues, by what matters on Wikipedia: "not your sources but the 'support of the community'."[129]
Critics also point to the tendency to cover topics in detail disproportionate to their importance. For example, Stephen Colbert once mockingly praised Wikipedia for having a "longer entry on 'lightsabers' than it does on the 'printing press'."[130] In an interview with The Guardian, Dale Hoiberg, the editor-in-chief of Encyclopædia Britannica, noted:
People write of things they're interested in, and so many subjects don't get covered; and news events get covered in great detail. In the past, the entry on Hurricane Frances was more than five times the length of that on Chinese art, and the entry on Coronation Street was twice as long as the article on Tony Blair.[64]
This critical approach has been satirised as "Wikigroaning", a term coined by Jon Hendren[131] of the website Something Awful.[132] In the game, two articles (preferably with similar names) are compared: one about an acknowledged serious or classical subject and the other about a popular topic or current event.[133] Defenders of broad inclusion criteria have held that the encyclopedia's coverage of pop culture does not impose space constraints on the coverage of more serious subjects (see "Wiki is not paper"). Ivor Tossell wrote:
That Wikipedia is chock full of useless arcana (and did you know, by the way, that the article on "Debate" is shorter than the piece that weighs the relative merits of the 1978 and 2003 versions of Battlestar Galactica?) isn't a knock against it: Since it can grow infinitely, the silly articles aren't depriving the serious ones of space.[134]
Wikipedia has been accused of deficiencies in comprehensiveness because of its voluntary nature, and of reflecting the systemic biases of its contributors. Wikipedia co-founder Larry Sanger stated in 2004, "when it comes to relatively specialized topics (outside of the interests of most of the contributors), the project's credibility is very uneven."[135] He expanded on this 16 years later, in May 2020, by comparing how differences in coverage shape the tone of the articles on U.S. presidents Donald Trump (seen as negative) and Barack Obama (seen as positive).
In a GamesRadar editorial, columnist Charlie Barrat juxtaposed Wikipedia's coverage of video game-related topics with its smaller content about topics that have greater real-world significance, such as God, World War II and former U.S. presidents.[136] Wikipedia has been praised for making it possible for articles to be updated or created in response to current events. Its editors have also argued that, as a website, Wikipedia is able to include articles on a greater number of subjects than print encyclopedias can.[137]
A 2011 study reported evidence of cultural bias in Wikipedia articles about famous people on both the English and Polish Wikipedias. These biases included those pertaining to the cultures of both the United States and Poland on each of the corresponding-language Wikipedias, as well as a pro-U.S./English-language bias on both of them.[138]
See main article: Ideological bias on Wikipedia. Wikipedia co-founder Jimmy Wales stated in 2006: "The Wikipedia community is very diverse, from liberal to conservative to libertarian and beyond. If averages mattered, and due to the nature of the wiki software (no voting) they almost certainly don't, I would say that the Wikipedia community is slightly more liberal than the U.S. population on average, because we are global and the international community of English speakers is slightly more liberal than the U.S. population. There are no data or surveys to back that."[139]
A number of politically conservative commentators have argued that Wikipedia's coverage is affected by liberal bias.[140] Andrew Schlafly created Conservapedia because he found Wikipedia "increasingly anti-Christian and anti-American", citing its frequent use of British spelling and its coverage of topics like creationism and the effect of Christianity on the Renaissance.[141] In 2007, an article in The Christian Post criticised Wikipedia's coverage of intelligent design, saying that it was biased and hypocritical.[142] Lawrence Solomon of the National Review stated that Wikipedia articles on subjects like global warming, intelligent design, and Roe v. Wade are slanted in favor of liberal views.[143] In a September 2010 issue of the conservative weekly Human Events, Rowan Scarborough presented a critique of Wikipedia's coverage of American politicians prominent in the approaching midterm elections as evidence of systemic liberal bias. Scarborough compared the biographical articles of liberal and conservative opponents in Senate races in the Alaska Republican primary and the Delaware and Nevada general elections, emphasizing the quantity of negative coverage of candidates endorsed by the Tea Party movement. He also cited criticism by Lawrence Solomon and quoted in full the lead section of Wikipedia's article on the conservative wiki Conservapedia as evidence of an underlying bias.[144] Jonathan Sidener of The San Diego Union-Tribune wrote that "vandalism and self-serving misinformation [are] common particularly in the political articles".[145] A 2015 study found that negative facts were more likely to be removed from Wikipedia articles on U.S. senators than positive facts, but did not find any significant difference relating to political affiliation.[146]
Amid the George Floyd protests, there were several disputes over racial justice on Wikipedia.[140] The Wikipedia community voted against a proposal to black out the website in support of Black Lives Matter, partly out of concern that it would threaten Wikipedia's reputation for neutrality. The debate also led to the creation of WikiProject Black Lives Matter, in line with AfroCROWD's Juneteenth efforts to improve the coverage of civil rights movement-related topics; the project was nominated for deletion on the grounds that it was "non-neutral advocacy".[140] On Wikipedia, neutrality is treated less as a fixed standard than as a process achieved through consensus. Social scientist Jackie Koerner took issue with the word neutrality, saying she preferred balance because she believed that one of Wikipedia's goals should be knowledge equity.[140]
The Japanese Wikipedia has been accused of right-wing historical revisionism, particularly on articles related to its role in World War II and colonialism, by a number of scholars.[147] [148] [149] [150] The issue has been the subject of research supported by the Wikimedia Foundation.[151]
Although Wikipedia states that it is not a primary source, it has been used as evidence in legal cases. In January 2007, The New York Times reported that U.S. courts vary greatly in their treatment of Wikipedia as a source of information, with over 100 judicial rulings having relied on the encyclopedia, including cases involving taxes, narcotics, and civil matters such as personal injury and matrimonial disputes.[152]
In April 2012, The Wall Street Journal reported that in the five years since the 2007 New York Times story, federal courts of appeals had cited Wikipedia about 95 times. The story also reported that the U.S. Court of Appeals for the Fourth Circuit had vacated convictions in a cockfighting case because a juror used Wikipedia to research an element of the crime, expressing in its decision concerns about Wikipedia's reliability.[153]
In one notable case, a decision concerning the Formula One racing trademark,[154] the UK Intellectual Property Office considered both the reliability of Wikipedia and its usefulness as a source of evidence.
In the United States, the Court of Federal Claims has ruled that "Wikipedia may not be a reliable source of information"[155] and that "...Articles [from Wikipedia] do not—at least on their face—remotely meet this reliability requirement...A review of the Wikipedia website reveals a pervasive and, for our purposes, disturbing series of disclaimers...".[152] [156] Such disclaimers include Wikipedia's statements that it cannot guarantee the validity of the information in its articles and that it has no formal peer review.
Among the reasons given for these statements about Wikipedia's reliability are the instability of its articles (ongoing editing means that new readers may find information differing from what was originally cited) and, according to Stephen Gillers, a professor at New York University Law School, the fact that "the most critical fact is public acceptance"; therefore, "a judge should not use Wikipedia when the public is not prepared to accept it as authority".[157]
Wikipedia has also become a key source for some current news events, such as the 2007 Virginia Tech massacre; The New York Times, citing Wikimedia figures, reported that the article received 750,000 page views in the two days after the event.
The Washington Post commented, in the context of 2008 presidential election candidate biographies, that despite occasional brief vandalism, "it's hard to find a more up-to-date, detailed, thorough article on Obama than Wikipedia's. As of Friday (14 September 2007), Obama's article—more than 22 pages long, with 15 sections covering his personal and professional life—had a reference list of 167 sources."[158]
Several commentators have taken a middle ground, asserting that the project contains much valuable knowledge and has some reliability, even if the degree is not yet assessed with certainty. Those taking this view include danah boyd [sic], who in 2005 discussed Wikipedia as an academic source, concluding that "[i]t will never be an encyclopedia, but it will contain extensive knowledge that is quite valuable for different purposes",[159] and Bill Thompson, who stated, "I use the Wikipedia a lot. It is a good starting point for serious research, but I would never accept something that I read there without checking."
Information Today's March 2006 article[68] concluded on a similar theme.
Dan Gillmor, a Silicon Valley commentator and author, said in October 2004: "I don't think anyone is saying Wikipedia is an absolute replacement for a traditional encyclopedia. But in the topics I know something about, I've found Wikipedia to be as accurate as any other source I've found."
Larry Sanger stated on Kuro5hin in 2001 that "Given enough eyeballs, all errors are shallow",[160] which is a paraphrase of Linus' Law of open-source development.
Likewise, technology figure Joi Ito wrote on Wikipedia's authority, "[a]lthough it depends a bit on the field, the question is whether something is more likely to be true coming from a source whose resume sounds authoritative, or a source that has been viewed by hundreds of thousands of people (with the ability to comment) and has survived."[161]
In a 2008 letter to the editor of Physics Today, Gregg Jaeger, an associate professor at Boston University,[162] characterized Wikipedia as a medium that is susceptible to fostering "anarchy and distortions" in relation to scientific information.[163]
People known to use or recommend Wikipedia as a reference source include film critic Roger Ebert,[164] [165] [166] [167] comedian Rosie O'Donnell,[168] University of Maryland physicist Robert L. Park,[169] Rutgers University sociology professor Ted Goertzel,[170] [171] and scientific skepticism promoter and investigator James Randi.[172] Periodicals that publish articles featuring citations of Wikipedia as a source include the American science magazines Skeptic[173] [174] and Skeptical Inquirer.[175] In a January 2013 episode of his talk show Stossel, about how ideas can flourish without regulation, journalist John Stossel interviewed Wikipedia co-founder Jimmy Wales and discussed the success of Wikipedia's model versus that of Britannica; Stossel stated that his own Wikipedia article contained only one error.[176]
Jean Goodwin has written on the reasons why Wikipedia may be trusted: according to Goodwin, while readers may not be able to assess the actual expertise of a given article's authors, they can assess the passion of its Wikipedian editors, which in itself provides some grounds for trust.[177]
Inaccurate information may persist in Wikipedia for a long time before it is challenged. The most prominent cases reported by mainstream media involved biographies of living persons. The Seigenthaler incident demonstrated that the subject of a biographical article must sometimes fix blatant lies about his or her own life. In May 2005, a user edited the biographical article on John Seigenthaler Sr. so that it contained several false and defamatory statements.[178] The inaccurate claims went unnoticed between May and September 2005 when they were discovered by Victor S. Johnson, Jr., a friend of Seigenthaler. Wikipedia content is often mirrored at sites such as Answers.com, which means that incorrect information can be replicated alongside correct information through a number of web sources. Such information can develop a misleading air of authority because of its presence at such sites: "Then [Seigenthaler's] son discovered that his father's hoax biography also appeared on two other sites, Reference.com and Answers.com, which took direct feeds from Wikipedia. It was out there for four months before Seigenthaler realized and got the Wikipedia entry replaced with a more reliable account. The lies remained for another three weeks on the mirror sites downstream."[179]
Seth Finkelstein reported in an article in The Guardian on his efforts to remove his own biography from Wikipedia after it was subjected to defamation: "Wikipedia has a short biography of me, originally added in February 2004, mostly concerned with my internet civil liberties achievements. After discovering in May 2006 that it had been vandalised in March, possibly by a long-time opponent, and that the attack had been subsequently propagated to many other sites which (legally) repackage Wikipedia's content, the article's existence seemed to me overall to be harmful rather than helpful." He added: "For people who are not very prominent, Wikipedia biographies can be an 'attractive nuisance'. It says, to every troll, vandal, and score-settler: 'Here's an article about a person where you can, with no accountability whatsoever, write any libel, defamation, or smear. It won't be a marginal comment with the social status of an inconsequential rant, but rather will be made prominent about the person, and reputation-laundered with the institutional status of an encyclopedia.'"[180]
In the same article, Finkelstein recounted how he voted his own biography "not notable enough" in order to have it removed from Wikipedia. He went on to recount a similar story involving Angela Beesley, previously a prominent member of the foundation that runs Wikipedia. Taner Akçam, a Turkish history professor at the University of Minnesota, was detained at the Montreal airport in 2007 after his Wikipedia article was vandalized by Turkish nationalists. Although the mistake was resolved, he was arrested again in the United States on the same suspicion two days later.[181]
On March 2, 2007, MSNBC reported that Hillary Clinton had been incorrectly listed for 20 months in her Wikipedia biography as valedictorian of her class of 1969 at Wellesley College. (Hillary Rodham was not the valedictorian, though she did speak at commencement.)[182] The article included a link to the Wikipedia edit,[183] where the incorrect information was added on July 9, 2005. After the msnbc.com report, the inaccurate information was removed the same day.[184]
Attempts to perpetrate hoaxes may not be confined to editing Wikipedia articles. In October 2005, Alan Mcilwraith, a former call center worker from Scotland, created a Wikipedia article in which he claimed to be a highly decorated war hero. The article was quickly identified by other users as unreliable (see the Wikipedia Signpost article of April 17, 2006); however, Mcilwraith had also succeeded in convincing a number of charities and media organizations that he was who he claimed to be: "The 28-year-old, who calls himself Captain Sir Alan McIlwraith, KBE, DSO, MC, has mixed with celebrities for at least one fundraising event. But last night, an Army spokesman said: 'I can confirm he is a fraud. He has never been an officer, soldier or Army cadet.'"[185]
In May 2010, French politician Ségolène Royal publicly praised the memory of Léon-Robert de l'Astran, an 18th-century naturalist, humanist and son of a slave trader, who had supposedly opposed the slave trade. The newspaper Sud-Ouest revealed a month later that de l'Astran had never existed—except as the subject of an article in the French Wikipedia. Historian Jean-Louis Mahé discovered that de l'Astran was fictional after a student, intrigued by Royal's praise of him, asked Mahé about him. Mahé's research led him to realize that de l'Astran appeared in no archives, and he traced the hoax back to the Rotary Club of La Rochelle. The article, created by members of the Club in January 2007, had thus remained online for three years—unsourced—before the hoax was uncovered. Upon Sud-Ouest's revelation—repeated in other major French newspapers—the French Wikipedia administrator DonCamillo immediately deleted the article.[186] [187] [188] [189] [190]
There have also been instances of users deliberately inserting false information into Wikipedia in order to test the system and demonstrate its alleged unreliability. Journalist Gene Weingarten ran such a test in 2007 by anonymously inserting false information into his own biography. The fabrications were removed 27 hours later by a Wikipedia editor who was regularly watching changes to that article.[191] Television personality Stephen Colbert lampooned this drawback of Wikipedia, calling it wikiality.[192]
"Death by Wikipedia" is a phenomenon in which a person is erroneously proclaimed dead through vandalism. Articles about the comedian Paul Reiser, British television host Vernon Kay, French professor Bertrand Meyer, and the West Virginia Senator Robert Byrd, who died on June 28, 2010, have been vandalized in this way.[193] [194] [195]
In June 2007, an anonymous Wikipedia contributor was drawn into the investigation of the Chris Benoit double murder and suicide because of an unverified piece of information he had added to the "Chris Benoit" English Wikipedia article. The information regarding the death of Benoit's wife was added fourteen hours before police discovered the bodies of Benoit and his family.[196] Police detectives seized computer equipment from the man held responsible for the postings, but they believed he was uninvolved and did not press charges.[197] The IP address from which the edit was made was traced to earlier instances of Wikipedia vandalism. The contributor apologized on Wikinews, saying: "I will never vandalize anything on Wikipedia or post wrongful information. I will never post anything here again unless it is pure fact ... ."[198]
On August 29, 2008, shortly after the first-round draw for the UEFA Europa League was completed, an edit adding a false nickname was made to the article on the football club AC Omonia, apparently by users of the website B3ta.[199] On September 18, 2008, David Anderson, a British journalist writing for the Daily Mirror, quoted the nickname in his match preview ahead of Omonia's game with Manchester City; the preview appeared in the web and print versions of the Mirror, and the nickname was repeated in subsequent editions on September 19.[200] [201]
In May 2009, University College Dublin sociology student Shane Fitzgerald added a fabricated quote to the article on the recently deceased composer Maurice Jarre. Fitzgerald wanted to demonstrate the potential dangers of news reporters' reliance on the internet for information.[202] Although Fitzgerald's edits were removed three times from the Wikipedia article for lack of sourcing,[203] they were nevertheless copied into obituary columns in newspapers worldwide.[204] Fitzgerald believes that, had he not come forward, his quote would have passed into history as fact.[203]
The death of Norman Wisdom in October 2010 led several major newspapers to repeat the false claim, drawn from Wikipedia, that he was the author of the lyrics of the Second World War song "(There'll Be Bluebirds Over) The White Cliffs of Dover".[205]
After the 2010 FIFA World Cup, FIFA president Sepp Blatter was presented with the Order of the Companions of Oliver Reginald Tambo. The citation, however, read: "The Order of the Companions of OR Tambo in Gold—awarded to Joseph Sepp Bellend Blatter (1936–) for his exceptional contribution to the field of football and support for the hosting of the Fifa World Cup on the African continent", after the name on his Wikipedia entry was vandalized.[206]
In October 2012, the Asian Football Confederation official website published an article about the United Arab Emirates national football team's bid to qualify for the 2015 AFC Asian Cup, in which the team's nickname was stated to be the "Sand Monkeys". This was the indirect result of vandalism of the Wikipedia article on the team, and the AFC was forced to apologise for what was perceived as a racist slur.[207] [208]
In December 2012, an article titled "Bicholim conflict" was deleted after standing since 2007.[209] It described a war supposedly fought in India between 1640 and 1641 but was later confirmed to be entirely fictitious.[210] The hoax article had been awarded Wikipedia's "Good Article" status (conferred on fewer than 1 percent of articles on the site) a few months after its creation in 2007 and held that status for five years.[211]
In March 2013, it was discovered that both Wikipedia and IMDb had for three-and-a-half years contained articles on a fictitious Russian filmmaker named Yuri Gadyukin. False information had been planted in both sites as part of a viral promotion campaign for an upcoming film.[212]
In May 2014, The New Yorker reported that a 17-year-old student had added an invented nickname to the Wikipedia article on the coati in 2008, saying coatis were also known as "Brazilian aardvarks". The taxonomically false information, inserted as a private joke, lasted for six years in Wikipedia and over this time came to be propagated by hundreds of websites, several newspapers (one of which was later cited as a source in Wikipedia) and even books published by university presses. It was only removed from Wikipedia after publication of the New Yorker article, in which the student explained how the joke had come about.
In March 2015, it became known that an article on Wikipedia entitled "Jar'Edo Wens", purportedly about an Australian aboriginal deity of that name, was a hoax. The article had survived for more than nine years before being deleted, making it one of the longest-lived documented hoax articles in Wikipedia's history. The article spawned mentions of the fake god on numerous other websites as well as in a book titled Atheism and the Case Against Christ.[213] [214] [215]
In August 2019, a discredited theory was removed from the article Warsaw concentration camp, over 10 years after it had been debunked in mainstream scholarly literature. The article was first drafted in August 2004 by an established editor who presented as fact a fringe theory that the camp contained gas chambers in which 200,000 people perished. With the misinformation presented as fact for 15 years, media sources dubbed it "Wikipedia's longest-standing hoax".[216] [217] [218]
In June 2022, it was discovered that an editor known as Zhemao (Chinese: 折毛) had created over 200 articles on the Chinese Wikipedia about fabricated events in medieval Russian history.[219] Dubbed the Zhemao hoaxes, the hoax articles combined research and fantasy, creating an alternate history centered around a "Kashin silver mine" and political ties between "princes of Tver" and "dukes of Moscow".[220]
In August 2022, Wikipedia criticism site Wikipediocracy published an interview with a hoaxer who ten years prior had added a hoax to Wikipedia, claiming that an "Alan MacMasters" had invented the electric toaster. The false information was widely reproduced online as well as in newspapers and books subsequently cited in Wikipedia.[221] [222] [223]
In 2023, Jan Grabowski and Shira Klein published an article in the Journal of Holocaust Research in which they claimed to have discovered a "systematic, intentional distortion of Holocaust history" on the English-language Wikipedia.[224] Analysing 25 Wikipedia articles and almost 300 back pages (including talk pages, noticeboards and arbitration cases), Grabowski and Klein argued that a small group of editors had managed to impose a fringe narrative on Polish-Jewish relations, informed by Polish nationalist propaganda and far removed from evidence-driven historical research. In addition to the article on the Warsaw concentration camp, the authors claimed that the group's activities affected several other articles, such as History of the Jews in Poland, Rescue of Jews by Poles during the Holocaust and Jew with a coin. The alleged nationalist editing of these and other articles included content ranging "from minor errors to subtle manipulations and outright lies", of which the authors provide examples.
See main article: Conflict-of-interest editing on Wikipedia.
While Wikipedia policy requires articles to have a neutral point of view, there have been attempts to place a spin on articles. In January 2006, several staffers of members of the U.S. House of Representatives attempted to cleanse their bosses' biographies on Wikipedia and to insert negative remarks about political opponents. References to a campaign promise by Martin Meehan to surrender his seat in 2000 were deleted, and negative comments were inserted into the articles on U.S. Senator Bill Frist and Eric Cantor, a congressman from Virginia. Numerous other changes were made from an IP address assigned to the House of Representatives.[225] In an interview, Jimmy Wales remarked that the changes were "not cool".[226]
On August 31, 2008, The New York Times ran an article detailing the edits made to the biography of Sarah Palin in the wake of her nomination as John McCain's running mate. During the 24 hours before the McCain campaign's announcement, 30 edits, many of them adding flattering details, were made to the article by a single-purpose Wikipedia account, "Young Trigg". This person later acknowledged working on the McCain campaign and having several Wikipedia user accounts.[227] [228]
Larry Delay and Pablo Bachelet have written that, in their view, some articles dealing with Latin American history and groups (such as the Sandinistas and Cuba) lack political neutrality and are written from a sympathetic Marxist perspective that treats socialist dictatorships favorably at the expense of alternative positions.[229] [230]
In November 2007, libelous accusations were made against two politicians from southwestern France, Jean-Pierre Grand and Hélène Mandroux-Colas, in their Wikipedia biographies. Grand asked the president of the French National Assembly and the prime minister of France to strengthen the legislation on the penal responsibility of Internet sites and of authors who peddle false information in order to cause harm.[231] Senator Jean Louis Masson then asked the Minister of Justice whether it would be possible to increase the criminal liability of hosting providers, site operators, and authors of libelous content; the minister declined to do so, citing the existing rules of the LCEN law.[232]
In 2009, Wikipedia banned the Church of Scientology from editing articles on its site after members of the group had edited Scientology-related articles to improve the church's portrayal.[233]
On August 25, 2010, the Toronto Star reported that the Canadian "government is now conducting two investigations into federal employees who have taken to Wikipedia to express their opinion on federal policies and bitter political debates."[234]
In 2010, Al Jazeera's Teymoor Nabili suggested that the article Cyrus Cylinder had been edited for political purposes, reflecting "an apparent tussle of opinions in the shadowy world of hard drives and 'independent' editors that comprise the Wikipedia industry." He suggested that after the 2009 Iranian presidential election and the ensuing "anti-Iranian activities", a "strenuous attempt to portray the cylinder as nothing more than the propaganda tool of an aggressive invader" was visible. In his analysis, the edits made during 2009 and 2010 represented "a complete dismissal of the suggestion that the cylinder, or Cyrus' actions, represent concern for human rights or any kind of enlightened intent", in stark contrast to Cyrus' own reputation among the people of Babylon as recorded in the Old Testament.[235]
In April 2008, the Boston-based Committee for Accuracy in Middle East Reporting in America (CAMERA) organized an e-mail campaign to encourage readers to correct perceived Israel-related biases and inconsistencies in Wikipedia.[236] Excerpts of some of the e-mails were published in the July 2008 issue of Harper's Magazine under the title of "Candid camera".[237]
CAMERA argued that the excerpts were unrepresentative and that it had explicitly campaigned merely "toward encouraging people to learn about and edit the online encyclopedia for accuracy".[238] According to some defenders of CAMERA, serious misrepresentations of CAMERA's role emanated from the rival Electronic Intifada group; moreover, they argued, some other Palestinian advocacy groups have been guilty of systematic misrepresentation and manipulative behavior but have not suffered bans of editors among their staff or volunteers.[239] [240]
Five editors involved in the campaign were sanctioned by Wikipedia administrators.[241] Israeli diplomat David Saranga said that Wikipedia is generally fair in regard to Israel. When confronted with the fact that the entry on Israel mentioned the word "occupation" nine times, whereas the entry on the Palestinian people mentioned "terror" only once, he replied: "It means only one thing: Israelis should be more active on Wikipedia. Instead of blaming it, they should go on the site much more, and try and change it."[242]
Political commentator Haviv Rettig Gur, reviewing widespread perceptions in Israel of systemic bias in English-language Wikipedia articles, has argued that deeper structural problems create this bias: anonymous editing favors biased results, especially when those Gur calls "pro-Palestinian activists" organize concerted campaigns, as has putatively been done in articles dealing with Arab-Israeli issues, and current Wikipedia policies, while well-meant, have proven ineffective in handling this.[243]
On August 3, 2010, it was reported that the Yesha Council, together with Israel Sheli (My Israel), a network of pro-Israel activists committed to spreading Zionism online, were organizing a workshop in Jerusalem to teach people how to edit Wikipedia articles in a pro-Israeli way.[244] [245] Around 50 people took part in the course.[246]
The project organiser, Ayelet Shaked, who has since been elected to Israel's parliament, was interviewed on Arutz Sheva Radio. She emphasized that the information has to be reliable and meet Wikipedia's rules. She cited examples such as the use of the term "occupation" in Wikipedia entries, as well as the editing of entries that link Israel with Judea and Samaria and with Jewish history.[247]
"We don't want to change Wikipedia or turn it into a propaganda arm," commented Naftali Bennett, director of the Yesha Council. "We just want to show the other side. People think that Israelis are mean, evil people who only want to hurt Arabs all day."[248] "The idea is not to make Wikipedia rightist but for it to include our point of view," he said in another interview.[246]
A course participant explained that the course was not a "Zionist conspiracy to take over Wikipedia" but rather an attempt to balance the information about disputed issues presented in the online encyclopedia:
[T]he goal of this workshop was to train a number of pro-Israelis how to edit Wikipedia so that more people could present the Israeli side of things, and thus the content would be more balanced... Wikipedia is meant to be a fair and balanced source, and it is that way by having people from all across the spectrum contributing to the content.[249]
Following the course announcement, Abdul Nasser An-Najar, the head of the Palestinian Journalists Syndicate, said there were plans to set up a counter-group to ensure that the Palestinian view was presented online, because the "next regional war will be [a] media war."[248]
In 2011, Wikipedia co-founder Jimmy Wales said in retrospect of the course organized by Israel Sheli: "we saw absolutely no impact from that effort whatsoever. I don't think it ever—it was in the press but we never saw any impact."[250]
In January 2012, members of the public relations industry created the Corporate Representatives for Ethical Wikipedia Engagement (CREWE) Facebook group with the stated goal of maintaining accurate articles about corporations.[251]
In an October 2012 Salon story, Wikipedia co-founder Jimmy Wales stated that he was against the practice of paid editing of Wikipedia, as are a number of long-time members of Wikipedia's community. Nonetheless, some organizations pay employees to edit Wikipedia articles; one writer, Soraya Field Fiorio, stated that she writes commissioned Wikipedia articles for writers and musicians for $30 an hour. According to Fiorio, her clients control the articles' content in the same way that they control press releases, which function as part of publicity strategies.[252]
In January 2007, Rick Jelliffe claimed in a story carried by CBS[253] and IDG News Service[254] [255] that Microsoft had offered him compensation in exchange for his future editorial services on OOXML. A Microsoft spokesperson, quoted by CBS, commented that "Microsoft and the writer, Rick Jelliffe, had not determined a price and no money had changed hands, but they had agreed that the company would not be allowed to review his writing before submission".
In a story covered by the BBC, Jeffrey Merkey claimed that in exchange for a donation his Wikipedia entry was edited in his favor. Jay Walsh, a spokesman for Wikipedia, flatly denied the allegations in an interview given to the Daily Telegraph.[256]
In a story covered by InformationWeek, Eric Goldman, an assistant law professor at Santa Clara University in California, argued that "eventually, marketers will build scripts to edit Wikipedia pages to insert links and conduct automated attacks on Wikipedia",[257] putting the encyclopedia beyond its editors' ability to mount countermeasures, particularly because of a vicious circle in which the strain of responding to these attacks drives core contributors away, increasing the burden on those who remain.[258]
In February 2008, British technology news and opinion website The Register stated that a prominent administrator of Wikipedia had edited a topic area where he had a conflict of interest to keep criticism to a bare minimum, as well as altering the Wikipedia policies regarding personal biography and conflict of interest to favour his editing.[259]
Some of the most scathing criticism of Wikipedia's claimed neutrality came from The Register, which in turn was reportedly criticized by founding members of the project. According to The Register: "In short, Wikipedia is a cult. Or at least, the inner circle is a cult. We aren't the first to make this observation. On the inside, they reinforce each other's beliefs. And if anyone on the outside questions those beliefs, they circle the wagons. They deny the facts. They attack the attacker. After our Jossi Fresco story, Fresco didn't refute our reporting. He simply accused us of 'yellow journalism'. After our Overstock.com article, Wales called us 'trash'."[260]
Charles Arthur in The Guardian said that "Wikipedia, and so many other online activities, show all the outward characteristics of a cult."[261]
In February 2015, a longstanding Wikipedia administrator was site-banned after Wikipedia's Arbitration Committee found that they had, over a period of several years, manipulated Wikipedia articles to add positive material and remove negative material about the controversial Indian Institute of Planning and Management and its dean, Arindam Chaudhuri. An Indian journalist commented in Newsweek on the importance of the Wikipedia article to the institute's PR campaign and voiced the opinion that "by letting this go on for so long, Wikipedia has messed up perhaps 15,000 students' lives".[262] [263]
The 2005 Nature study also gave two brief examples of challenges that Wikipedian science writers purportedly faced on Wikipedia. The first concerned the addition of a section on violence to the schizophrenia article which, in the view of one of the article's regular editors, neuropsychologist Vaughan Bell, was little more than a "rant" about the need to lock people up; editing it prompted him to look up the literature on the topic.[18]
The second dispute reported by Nature concerned the climatologist William Connolley and protracted disputes among editors of climate-change topics, in which Connolley was placed on parole and several opponents were banned from editing climate-related articles for six months;[18] a separate paper commented that this was more about etiquette than bias and that Connolley did "not suffer fools gladly".[264]
"While estimates of its influence can vary, Wikipedia is probably the most important single source in the training of A.I. models. ... In fact, no one I spoke with in the tech community seemed to know if it would even be possible to build a good A.I. model without Wikipedia."