In statistics, the antithetic variates method is a variance reduction technique used in Monte Carlo methods. Because the error of a Monte Carlo simulation decreases only at a rate proportional to one over the square root of the number of sample paths, a very large number of paths is required to obtain an accurate result. The antithetic variates method reduces the variance of the simulation results.[1][2]
The antithetic variates technique consists, for every sample path obtained, in also taking its antithetic path: given a path
\{\varepsilon_1, \ldots, \varepsilon_M\},
one also takes the path
\{-\varepsilon_1, \ldots, -\varepsilon_M\}.
Suppose that we would like to estimate
\theta = \mathrm{E}(h(X)) = \mathrm{E}(Y).
For that we have generated two samples,
Y_1 \text{ and } Y_2.
An unbiased estimate of \theta is
\hat\theta = \frac{Y_1 + Y_2}{2}.
And
\mathrm{Var}(\hat\theta) = \frac{\mathrm{Var}(Y_1) + \mathrm{Var}(Y_2) + 2\,\mathrm{Cov}(Y_1, Y_2)}{4},
so the variance is reduced if
\mathrm{Cov}(Y_1, Y_2)
is negative.
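As a quick numerical check (a sketch using NumPy; the monotone function h(z) = e^z and the sample size are illustrative assumptions, not part of the method), an antithetic pair Y1 = h(Z), Y2 = h(-Z) exhibits negative covariance, and the variance of the averaged estimator matches the formula:

```python
import numpy as np

rng = np.random.default_rng(0)

# Antithetic pair for a monotone function h: Y1 = h(Z), Y2 = h(-Z).
# h(z) = exp(z) is a hypothetical choice; any monotone h gives Cov(Y1, Y2) <= 0.
z = rng.standard_normal(100_000)
y1, y2 = np.exp(z), np.exp(-z)

cov = np.cov(y1, y2, ddof=0)[0, 1]          # empirical covariance (negative)
var_avg = np.var((y1 + y2) / 2)             # variance of the antithetic estimator
var_formula = (np.var(y1) + np.var(y2) + 2 * cov) / 4

print(cov, var_avg, var_formula)
```

With matching degrees of freedom, `var_avg` and `var_formula` agree up to floating-point error, and the covariance comes out negative, which is exactly the condition for a variance reduction.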
If the law of the variable X follows a uniform distribution along [0, 1], the first sample will be
u_1, \ldots, u_n,
where each u_i is drawn from U(0, 1). The second sample is built as
u'_1, \ldots, u'_n,
where u'_i = 1 - u_i. If u_i follows a uniform distribution along [0, 1], then so does u'_i, and the two samples are negatively correlated.
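A minimal sketch of this construction in NumPy (the sample size is an arbitrary choice): the antithetic sample costs no extra random draws, and each pair (u_i, 1 - u_i) is perfectly negatively correlated.

```python
import numpy as np

rng = np.random.default_rng(0)

u = rng.uniform(size=10_000)   # first sample u_1, ..., u_n drawn from U(0, 1)
u_anti = 1.0 - u               # antithetic sample u'_i = 1 - u_i, also U(0, 1)

# The two samples are perfectly negatively correlated.
corr = np.corrcoef(u, u_anti)[0, 1]
print(corr)
```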
We would like to estimate
I = \int_0^1 \frac{1}{1+x} \, dx.
The exact result is
I = \ln 2 \approx 0.69314718.
This integral can be seen as the expected value of
f(U),
where
f(x) = \frac{1}{1+x}
and U follows a uniform distribution on [0, 1].
The following table compares the classical Monte Carlo estimate (sample size 2n, with n = 1500) to the antithetic variates estimate (sample size n, completed with the transformed sample 1 - u_i):
                      Estimate    Standard error
Classical estimate    0.69365     0.00255
Antithetic variates   0.69399     0.00063
The use of the antithetic variates method thus achieves a substantial reduction of the standard error for the same total number of function evaluations.