Coefficient of multiple correlation

In statistics, the coefficient of multiple correlation is a measure of how well a given variable can be predicted using a linear function of a set of other variables. It is the correlation between the variable's values and the best predictions that can be computed linearly from the predictive variables.[1]

The coefficient of multiple correlation takes values between 0 and 1. Higher values indicate higher predictability of the dependent variable from the independent variables, with a value of 1 indicating that the predictions are exactly correct and a value of 0 indicating that no linear combination of the independent variables is a better predictor than is the fixed mean of the dependent variable.[2]

Correlation coefficient (r)    Direction and strength of correlation
 1                             Perfectly positive
 0.8                           Strongly positive
 0.5                           Moderately positive
 0.2                           Weakly positive
 0                             No association
-0.2                           Weakly negative
-0.5                           Moderately negative
-0.8                           Strongly negative
-1                             Perfectly negative
The coefficient of multiple correlation can be computed as the square root of the coefficient of determination, but only under the particular assumptions that an intercept is included and that the best possible linear predictors are used, whereas the coefficient of determination is defined for more general cases, including those of nonlinear prediction and those in which the predicted values have not been derived from a model-fitting procedure.

Definition

The coefficient of multiple correlation, denoted R, is a scalar that is defined as the Pearson correlation coefficient between the predicted and the actual values of the dependent variable in a linear regression model that includes an intercept.
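A minimal numerical sketch of this definition (using NumPy; the data and variable names are illustrative, not from the article): fit an ordinary least-squares regression with an intercept, then take the Pearson correlation between the actual and fitted values.

```python
import numpy as np

# Illustrative data: y depends linearly on two predictors plus noise.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
y = 2.0 * x1 - 1.0 * x2 + rng.normal(size=100)

# Least-squares fit with an intercept column.
X = np.column_stack([np.ones_like(x1), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

# R is the Pearson correlation between actual and predicted values.
R = np.corrcoef(y, y_hat)[0, 1]
print(R)  # between 0 and 1; large here, since the noise is small relative to the signal
```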

Computation

The square of the coefficient of multiple correlation can be computed using the vector

c = (r_{x_1 y}, r_{x_2 y}, \ldots, r_{x_N y})^\top

of correlations r_{x_n y} between the predictor variables x_n (independent variables) and the target variable y (dependent variable), and the correlation matrix R_{xx} of correlations between predictor variables. It is given by

R^2 = c^\top R_{xx}^{-1} c,

where c^\top is the transpose of c, and R_{xx}^{-1} is the inverse of the matrix

R_{xx} = \left(\begin{array}{cccc}
r_{x_1 x_1} & r_{x_1 x_2} & \ldots & r_{x_1 x_N} \\
r_{x_2 x_1} & \ddots & & \vdots \\
\vdots & & \ddots & \\
r_{x_N x_1} & \ldots & & r_{x_N x_N}
\end{array}\right).

If all the predictor variables are uncorrelated, the matrix R_{xx} is the identity matrix and R^2 simply equals c^\top c, the sum of the squared correlations with the dependent variable. If the predictor variables are correlated among themselves, the inverse of the correlation matrix R_{xx} accounts for this.
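The matrix formula can be checked numerically. A sketch (NumPy; the data is illustrative) that builds c and R_xx from pairwise Pearson correlations and compares c^⊤ R_xx^{-1} c with the squared correlation between y and its least-squares prediction:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)   # deliberately correlated predictors
y = x1 + x2 + rng.normal(size=n)

# Vector c of correlations between each predictor and y.
c = np.array([np.corrcoef(x1, y)[0, 1], np.corrcoef(x2, y)[0, 1]])

# Correlation matrix R_xx of the predictors.
Rxx = np.corrcoef(np.vstack([x1, x2]))

# Squared multiple correlation from the matrix formula.
R2_formula = c @ np.linalg.inv(Rxx) @ c

# Same quantity from a least-squares fit with an intercept.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
R2_fit = np.corrcoef(y, X @ beta)[0, 1] ** 2

print(np.isclose(R2_formula, R2_fit))  # True
```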

The squared coefficient of multiple correlation can also be computed as the fraction of variance of the dependent variable that is explained by the independent variables, which in turn is 1 minus the unexplained fraction. The unexplained fraction can be computed as the sum of squares of residuals (that is, the sum of the squares of the prediction errors) divided by the sum of squares of deviations of the values of the dependent variable from their mean.
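This variance decomposition can be sketched numerically as well (NumPy; the data is illustrative): compute R² as 1 minus the residual sum of squares over the total sum of squares about the mean, and note that it agrees with the squared correlation between the dependent variable and its prediction.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150
x = rng.normal(size=n)
z = rng.normal(size=n)
y = 3.0 * x - z + rng.normal(size=n)

# Least-squares predictions with an intercept.
X = np.column_stack([np.ones(n), x, z])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

ss_res = np.sum((y - y_hat) ** 2)     # unexplained (residual) sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)  # total sum of squares about the mean
R2 = 1.0 - ss_res / ss_tot            # explained fraction of variance

print(np.isclose(R2, np.corrcoef(y, y_hat)[0, 1] ** 2))  # True
```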

Properties

With more than two variables being related to each other, the value of the coefficient of multiple correlation depends on the choice of dependent variable: a regression of y on x and z will in general have a different R than will a regression of z on x and y. For example, suppose that in a particular sample the variable z is uncorrelated with both x and y, while x and y are linearly related to each other. Then a regression of z on y and x will yield an R of zero, while a regression of y on x and z will yield a strictly positive R. This follows since the correlation of y with its best predictor based on x and z is in all cases at least as large as the correlation of y with its best predictor based on x alone, and in this case, with z providing no explanatory power, it will be exactly as large.
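This asymmetry can be demonstrated with a small numerical sketch (NumPy; the sample and the construction of z are illustrative): build a sample in which z is exactly uncorrelated with x and y while x and y are correlated, then compare the two regressions.

```python
import numpy as np

def multiple_R(y, predictors):
    """R via R^2 = 1 - SS_res/SS_tot for an OLS fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    ss_res = np.sum((y - X @ beta) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return np.sqrt(max(0.0, 1.0 - ss_res / ss_tot))

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.0, 3.0, 5.0])  # correlated with x, but not perfectly

# Make z exactly uncorrelated with x and y in this sample:
# project a probe vector off the span of {1, x, y}.
Q, _ = np.linalg.qr(np.column_stack([np.ones(4), x, y]))
probe = np.array([1.0, 0.0, 0.0, 0.0])
z = probe - Q @ (Q.T @ probe)

print(multiple_R(z, [x, y]))  # ~0: x and y carry no information about z
print(multiple_R(y, [x, z]))  # strictly positive: y is correlated with x
```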

Notes and References

  1. Introduction to Multiple Regression, http://onlinestatbook.com/2/regression/multiple_regression.html
  2. Multiple correlation coefficient, http://mtweb.mtsu.edu/stats/regression/level3/multicorrel/multicorrcoef.htm