A proof is sufficient evidence or a sufficient argument for the truth of a proposition.[1][2][3][4]
The concept applies in a variety of disciplines,[5] with both the nature of the evidence or justification and the criteria for sufficiency being area-dependent. In the area of oral and written communication such as conversation, dialog, rhetoric, etc., a proof is a persuasive perlocutionary speech act, which demonstrates the truth of a proposition.[6] In any area of mathematics defined by its assumptions or axioms, a proof is an argument establishing a theorem of that area via accepted rules of inference starting from those axioms and from other previously established theorems.[7] The subject of logic, in particular proof theory, formalizes and studies the notion of formal proof.[8] In some areas of epistemology and theology, the notion of justification plays approximately the role of proof,[9] while in jurisprudence the corresponding term is evidence,[10] with "burden of proof" as a concept common to both philosophy and law.
In most disciplines, evidence is required to prove something. Evidence is drawn from experience of the world around us, with science obtaining its evidence from nature,[11] law obtaining its evidence from witnesses and forensic investigation,[12] and so on. A notable exception is mathematics, whose proofs are drawn from a mathematical world that begins with axioms and is further developed and enriched by previously established theorems.
Exactly what evidence is sufficient to prove something is also strongly area-dependent, usually with no absolute threshold of sufficiency at which evidence becomes proof.[13][14] In law, the same evidence that may convince one jury may not persuade another. Formal proof provides the main exception, where the criteria for proofhood are ironclad and it is impermissible to defend any step in the reasoning as "obvious" (except for the necessary ability of both the prover and the audience to correctly identify any symbol used in the proof);[15] for a well-formed formula to qualify as part of a formal proof, it must be the result of applying a rule of the deductive apparatus of some formal system to previous well-formed formulae in the proof sequence.[16]
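As an illustration, consider a minimal sketch of such a proof sequence in a Hilbert-style propositional system, assuming the standard axiom schemas K: φ → (ψ → φ) and S: (φ → (ψ → χ)) → ((φ → ψ) → (φ → χ)), with modus ponens as the only rule of inference. Every line is either an instance of an axiom schema or follows from earlier lines by the rule; no step may be defended as merely "obvious":

1. p → ((p → p) → p)   (instance of K, with φ := p and ψ := p → p)
2. (p → ((p → p) → p)) → ((p → (p → p)) → (p → p))   (instance of S, with φ := p, ψ := p → p, and χ := p)
3. (p → (p → p)) → (p → p)   (modus ponens on lines 1 and 2)
4. p → (p → p)   (instance of K, with φ := p and ψ := p)
5. p → p   (modus ponens on lines 3 and 4)

Even the conclusion p → p, trivial to accept informally, requires five fully explicit steps, which illustrates how the formal criterion of proofhood differs from ordinary, area-dependent standards of sufficiency.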
Proofs have been presented since antiquity. Aristotle used the observation that patterns of nature never display the machine-like uniformity of determinism as proof that chance is an inherent part of nature.[17] On the other hand, Thomas Aquinas used the observation of the existence of rich patterns in nature as proof that nature is not ruled by chance.[18]
Proofs need not be verbal. Before Copernicus, people took the apparent motion of the Sun across the sky as proof that the Sun went round the Earth.[19] Suitably incriminating evidence left at the scene of a crime may serve as proof of the identity of the perpetrator. Conversely, a verbal entity need not assert a proposition to constitute a proof of that proposition. For example, a signature constitutes direct proof of authorship; less directly, handwriting analysis may be submitted as proof of authorship of a document.[20] Privileged information in a document can serve as proof that the document's author had access to that information; such access might in turn establish the location of the author at a certain time, which might then provide the author with an alibi.
18th-century Scottish philosopher David Hume built on Aristotle's separation of belief from knowledge,[21] recognizing that one can be said to "know" something only if one has firsthand experience of it, which is proof in the strict sense, while one can infer that something is true, and therefore "believe" it, without knowing it, via evidence or supposition. This speaks to one way of separating proof from evidence:
If one cannot find one's chocolate bar and sees chocolate on one's napping roommate's face, this evidence may lead one to believe that the roommate ate the chocolate bar. But one does not know that the roommate ate it: it may turn out that the roommate put the candy away while straightening up and was thereby inspired to eat their own chocolate. Only if one directly experiences proof of the roommate eating it, perhaps by walking in on them doing so, does one know that the roommate did it.
In an absolute sense, one can be argued not to "know" anything except for the existence of one's own thoughts, as 17th-century philosopher John Locke pointed out.[22] Even earlier, Descartes had addressed this point when he said cogito, ergo sum (I think, therefore I am). While Descartes was attempting to "prove" logically that the world exists, his legacy in doing so is to have shown that one cannot have such proof, because all of one's perceptions could be false (as under the evil demon or simulated reality hypotheses). But one at least has proof of one's own thoughts existing, and strong evidence that the world exists, enough to be considered "proof" by practical standards, though always indirect and impossible to confirm objectively.