In probability theory, the method of moments is a way of proving convergence in distribution by proving convergence of a sequence of moment sequences.[1] Suppose X is a random variable and that all of the moments
\operatorname{E}(X^k)
exist. Further suppose the probability distribution of X is completely determined by its moments, i.e., there is no other probability distribution with the same sequence of moments (cf. the problem of moments). If {X_n} is a sequence of random variables, each with moments of all orders, and
\lim_{n\to\infty} \operatorname{E}(X_n^k) = \operatorname{E}(X^k)
for all values of k, then the sequence {X_n} converges to X in distribution.
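As a sketch of how the criterion can be applied (the variables Y_i, the normalized sums X_n, and the standard normal limit below are illustrative notation, not part of the statement above), suppose Y_1, Y_2, … are independent, identically distributed random variables with mean 0, variance 1, and finite moments of all orders, and set X_n = (Y_1 + … + Y_n)/\sqrt{n}. Expanding the k-th power of the sum and discarding terms whose expectation vanishes or is of lower order gives, under these assumptions,
\lim_{n\to\infty} \operatorname{E}(X_n^k) = \lim_{n\to\infty} \frac{1}{n^{k/2}} \sum_{1 \le i_1,\dots,i_k \le n} \operatorname{E}(Y_{i_1} Y_{i_2} \cdots Y_{i_k}) = \begin{cases} (k-1)!! & k \text{ even}, \\ 0 & k \text{ odd}. \end{cases}
Only index patterns in which the factors pair off into equal indices survive in the limit, and the number of such pairings of k factors is (k-1)!!. These limiting values are precisely the moments of the standard normal distribution, which is determined by its moments, so the criterion yields convergence of X_n to the standard normal distribution in distribution; this is, in outline, how the method applies to the central limit theorem.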
The method of moments was introduced by Pafnuty Chebyshev for proving the central limit theorem; Chebyshev cited earlier contributions by Irénée-Jules Bienaymé.[2] More recently, it has been applied by Eugene Wigner to prove Wigner's semicircle law, and has since found numerous applications in the theory of random matrices.[3]