In mathematics and its applications, the mean square is normally defined as the arithmetic mean of the squares of a set of numbers or of a random variable.[1]
It may also be defined as the arithmetic mean of the squares of the deviations between a set of numbers and a reference value (e.g., the mean or an assumed mean of the data),[2] in which case it may be known as the mean square deviation. When the reference value is the assumed true value, the result is known as the mean squared error.
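To make the two definitions concrete, here is a minimal Python sketch; the function names are illustrative, not from any standard library.

```python
def mean_square(values):
    """Arithmetic mean of the squares of the values."""
    return sum(v * v for v in values) / len(values)

def mean_square_deviation(values, ref):
    """Arithmetic mean of the squared deviations from a reference value."""
    return sum((v - ref) ** 2 for v in values) / len(values)

data = [2.0, 4.0, 6.0]
print(mean_square(data))                 # (4 + 16 + 36) / 3
print(mean_square_deviation(data, 4.0))  # deviations -2, 0, 2
```

When the reference value is the data's own mean, the mean square deviation coincides with the (population) variance of the sample.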
A typical estimate for the sample variance from a set of sample values x_i is

    s^2 = \frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})^2,

where \bar{x} is the sample mean.
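The sample-variance estimate above can be sketched directly from the formula; the helper name is illustrative, and the result is checked against Python's standard-library `statistics.variance`, which uses the same n − 1 divisor.

```python
import statistics

def sample_variance(xs):
    """Sample variance: mean squared deviation from the sample mean,
    divided by n - 1 rather than n (Bessel's correction)."""
    xbar = sum(xs) / len(xs)
    return sum((x - xbar) ** 2 for x in xs) / (len(xs) - 1)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(sample_variance(data))  # matches statistics.variance(data)
```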
The second moment of a random variable, E(X^2), is also called the mean square.
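For a random variable, the second moment E(X^2) can be approximated by the mean square of a large sample, which is exactly the arithmetic-mean-of-squares definition applied to draws from the distribution. A minimal Monte Carlo sketch, assuming X ~ Uniform(0, 1), whose true second moment is 1/3:

```python
import random

random.seed(0)
# Draw samples from Uniform(0, 1) and average their squares;
# this estimates E(X^2) = 1/3 for this distribution.
samples = [random.random() for _ in range(100_000)]
second_moment = sum(x * x for x in samples) / len(samples)
print(second_moment)  # close to 1/3
```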