In probability theory, an event is said to happen almost surely (sometimes abbreviated as a.s.) if it happens with probability 1 (with respect to the probability measure).[1] In other words, the set of outcomes on which the event does not occur has probability 0, even though the set might not be empty. The concept is analogous to the concept of "almost everywhere" in measure theory. In probability experiments on a finite sample space with a non-zero probability for each outcome, there is no difference between almost surely and surely (since having a probability of 1 entails including all the sample points); however, this distinction becomes important when the sample space is an infinite set,[2] because an infinite set can have non-empty subsets of probability 0.
Some examples of the use of this concept include the strong and uniform versions of the law of large numbers, the continuity of the paths of Brownian motion, and the infinite monkey theorem. The terms almost certainly (a.c.) and almost always (a.a.) are also used. Almost never describes the opposite of almost surely: an event that happens with probability zero happens almost never.[3]
Let $(\Omega,\mathcal{F},P)$ be a probability space. An event $E\in\mathcal{F}$ happens almost surely if $P(E)=1$. Equivalently, $E$ happens almost surely if the probability of $E$ not occurring is zero: $P(E^C)=0$. More generally, any set $E\subseteq\Omega$ (not necessarily in $\mathcal{F}$) happens almost surely if $E^C$ is contained in a null set: a subset $N\in\mathcal{F}$ such that $P(N)=0$. The notion of almost sureness depends on the probability measure $P$. If it is necessary to emphasize this dependence, it is customary to say that the event $E$ occurs $P$-almost surely, or almost surely $\left(P\right)$.
In general, an event can happen "almost surely", even if the probability space in question includes outcomes which do not belong to the event—as the following examples illustrate.
Imagine throwing a dart at a unit square (a square with an area of 1) so that the dart always hits an exact point in the square, in such a way that each point in the square is equally likely to be hit. Since the square has area 1, the probability that the dart will hit any particular subregion of the square is equal to the area of that subregion. For example, the probability that the dart will hit the right half of the square is 0.5, since the right half has area 0.5.
Next, consider the event that the dart hits exactly a point in the diagonals of the unit square. Since the area of the diagonals of the square is 0, the probability that the dart will land exactly on a diagonal is 0. That is, the dart will almost never land on a diagonal (equivalently, it will almost surely not land on a diagonal), even though the set of points on the diagonals is not empty, and a point on a diagonal is no less possible than any other point.
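The dart example can be sketched numerically. The following is a minimal Monte Carlo simulation (illustrative, not part of the original text): it samples uniformly random points in the unit square and counts how often they land within a thin band of half-width `eps` around either diagonal. As `eps` shrinks, the observed frequency shrinks toward 0, consistent with the diagonals themselves having area (and hence probability) zero.

```python
import random

def diagonal_band_frequency(eps, trials=100_000, seed=0):
    """Estimate the probability that a uniform random point in the unit
    square lands within vertical half-width `eps` of either diagonal."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x, y = rng.random(), rng.random()
        # The two diagonals of the unit square are y = x and y = 1 - x.
        if abs(y - x) < eps or abs(y - (1 - x)) < eps:
            hits += 1
    return hits / trials

# Each band has area roughly 2*eps, so the frequency vanishes as eps -> 0,
# even though points exactly on a diagonal are perfectly possible outcomes.
for eps in (0.1, 0.01, 0.001):
    print(eps, diagonal_band_frequency(eps))
```

The exact event "the dart lands on a diagonal" corresponds to `eps = 0`, whose probability is the limit of these shrinking frequencies: 0.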
See also: Infinite monkey theorem. Consider the case where a (possibly biased) coin is tossed, corresponding to the probability space $(\{H,T\}, 2^{\{H,T\}}, P)$, where the event $\{H\}$ occurs if heads is flipped and $\{T\}$ if tails. For this particular coin, it is assumed that the probability of flipping heads is $P(H)=p\in(0,1)$, from which it follows that the probability of flipping tails is $P(T)=1-p$.
Now, suppose an experiment were conducted where the coin is tossed repeatedly, with outcomes $\omega_1,\omega_2,\ldots$, under the assumption that each flip's outcome is independent of all the others, i.e., the flips are i.i.d. Define the sequence of random variables on the coin-toss space, $(X_i)_{i\in\mathbb{N}}$, where $X_i(\omega)=\omega_i$, so that each $X_i$ records the outcome of the $i$th flip.
In this case, any infinite sequence of heads and tails is a possible outcome of the experiment. However, any particular infinite sequence of heads and tails has probability 0 of being the exact outcome of the (infinite) experiment. This is because the i.i.d. assumption implies that the probability of flipping all heads over $n$ flips is simply $P(X_i=H,\ i=1,2,\ldots,n)=\left(P(X_1=H)\right)^n=p^n$. Letting $n\to\infty$ yields 0, since $p\in(0,1)$ by assumption. The result is the same no matter how heavily the coin is biased towards heads, so long as $p$ is constrained to lie strictly between 0 and 1.
Moreover, the event "the sequence of tosses contains at least one $T$" will also happen almost surely (i.e., with probability 1). But if, instead of an infinite number of flips, flipping stops after some finite time, say 1,000,000 flips, then the probability of getting an all-heads sequence, $p^{1{,}000{,}000}$, would no longer be 0, while the probability of getting at least one tails, $1-p^{1{,}000{,}000}$, would no longer be 1.
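The contrast between the infinite and finite cases can be sketched numerically. This is an illustrative computation, not part of the original text: $p^n$ decays toward 0 as $n$ grows, yet for any fixed finite $n$ it remains strictly positive.

```python
def all_heads_probability(p, n):
    """Probability of flipping all heads in n i.i.d. tosses with P(H) = p."""
    return p ** n

p = 0.9  # even a heavily biased coin
for n in (10, 100, 1000):
    print(n, all_heads_probability(p, n))  # shrinks rapidly toward 0

# For a fixed finite run the all-heads probability is tiny but nonzero,
# so "at least one tails" is very likely, yet not certain:
n = 100
print(1 - all_heads_probability(p, n))
```

(With `n = 1_000_000` as in the text, the true value $p^{1{,}000{,}000}$ is still strictly positive, though it underflows to 0.0 in double-precision floating point.)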
In asymptotic analysis, a property is said to hold asymptotically almost surely (a.a.s.) if, over a sequence of sets, the probability converges to 1. This is equivalent to convergence in probability. For instance, in number theory, a large number is asymptotically almost surely composite, by the prime number theorem; and in random graph theory, the statement "$G(n,p_n)$ is connected" (where $G(n,p)$ denotes the graphs on $n$ vertices with edge probability $p$) is true a.a.s. when, for some $\varepsilon>0$,
$$p_n > \frac{(1+\varepsilon)\ln n}{n}.$$
In number theory, this is referred to as "almost all", as in "almost all numbers are composite". Similarly, in graph theory, this is sometimes referred to as "almost surely".[7]
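The number-theoretic example above can be checked empirically. This sketch (illustrative, not part of the original text) computes, via a sieve of Eratosthenes, the fraction of composite numbers among the integers from 2 to $N$ for growing $N$; by the prime number theorem the prime fraction decays like $1/\ln N$, so the composite fraction tends to 1.

```python
def composite_fraction(N):
    """Fraction of the integers in [2, N] that are composite,
    using a sieve of Eratosthenes to count the primes."""
    is_prime = [True] * (N + 1)
    is_prime[0] = is_prime[1] = False
    for i in range(2, int(N ** 0.5) + 1):
        if is_prime[i]:
            for j in range(i * i, N + 1, i):
                is_prime[j] = False
    primes = sum(is_prime)
    total = N - 1  # integers from 2 to N inclusive
    return (total - primes) / total

# The fraction increases toward 1 as N grows: "almost all numbers are composite."
for N in (10**2, 10**4, 10**6):
    print(N, composite_fraction(N))
```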