Canonical correlation
In statistics, canonical-correlation analysis (CCA), also called canonical variates analysis, is a way of inferring information from cross-covariance matrices. If we have two vectors X = (X1, ..., Xn) and Y = (Y1, ..., Ym) of random variables, and there are correlations among the variables, then canonical-correlation analysis will find linear combinations of X and Y that have a maximum correlation with each other.[1] T. R. Knapp notes that "virtually all of the commonly encountered parametric tests of significance can be treated as special cases of canonical-correlation analysis, which is the general procedure for investigating the relationships between two sets of variables."[2] The method was first introduced by Harold Hotelling in 1936,[3] although in the context of angles between flats the mathematical concept was published by Camille Jordan in 1875.[4]
CCA is now a cornerstone of multivariate statistics and multi-view learning, and a great number of interpretations and extensions have been proposed, such as probabilistic CCA, sparse CCA, multi-view CCA, Deep CCA, and DeepGeoCCA.[5] Unfortunately, perhaps because of its popularity, the literature can be inconsistent with notation; we attempt to highlight such inconsistencies in this article to help the reader make the best use of the existing literature and techniques available.
Like its sister method PCA, CCA can be viewed in population form (corresponding to random vectors and their covariance matrices) or in sample form (corresponding to datasets and their sample covariance matrices). These two forms are almost exact analogues of each other, which is why their distinction is often overlooked, but they can behave very differently in high-dimensional settings.[6] We next give explicit mathematical definitions for the population problem and highlight the different objects in the so-called canonical decomposition; understanding the differences between these objects is crucial for interpretation of the technique.
Population CCA definition via correlations
Given two column vectors $X = (x_1, \dots, x_n)^T$ and $Y = (y_1, \dots, y_m)^T$ of random variables with finite second moments, one may define the cross-covariance $\Sigma_{XY} = \operatorname{cov}(X, Y)$ to be the $n \times m$ matrix whose $(i, j)$ entry is the covariance $\operatorname{cov}(x_i, y_j)$. In practice, we would estimate the covariance matrix based on sampled data from $X$ and $Y$ (i.e. from a pair of data matrices).
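For instance, a minimal sketch in Python (NumPy) of estimating the cross-covariance from a pair of data matrices; the data here is simulated and all variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.standard_normal((100, 3))   # 100 samples of a 3-dimensional X
Y = rng.standard_normal((100, 2))   # 100 samples of a 2-dimensional Y

# Centre each column, then form the n x m sample cross-covariance matrix,
# whose (i, j) entry estimates cov(x_i, y_j).
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
Sigma_XY = Xc.T @ Yc / (len(X) - 1)
print(Sigma_XY.shape)   # (3, 2)
```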
Canonical-correlation analysis seeks a sequence of vectors $a_k$ ($a_k \in \mathbb{R}^n$) and $b_k$ ($b_k \in \mathbb{R}^m$) such that the random variables $a_k^T X$ and $b_k^T Y$ maximize the correlation $\rho = \operatorname{corr}(a_k^T X, b_k^T Y)$. The (scalar) random variables $U = a_1^T X$ and $V = b_1^T Y$ are the first pair of canonical variables. Then one seeks vectors maximizing the same correlation subject to the constraint that they are to be uncorrelated with the first pair of canonical variables; this gives the second pair of canonical variables. This procedure may be continued up to $\min\{m, n\}$ times.
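Written out explicitly, the $k$-th pair solves the following constrained maximization (a standard statement of the problem, included here for concreteness):

(a_k, b_k) = \operatorname*{argmax}_{a,\, b} \operatorname{corr}(a^T X, b^T Y) \quad \text{subject to} \quad \operatorname{cov}(a^T X, a_j^T X) = \operatorname{cov}(b^T Y, b_j^T Y) = 0 \ \text{ for all } j < k.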
The sets of vectors $a_k$, $b_k$ are called canonical directions or weight vectors or simply weights. The 'dual' sets of vectors $\Sigma_{XX} a_k$, $\Sigma_{YY} b_k$ are called canonical loading vectors or simply loadings; these are often more straightforward to interpret than the weights.[7]
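For example, given a data matrix for $X$ and a matrix of weight vectors, the corresponding loadings can be computed directly (a minimal NumPy sketch; the weight matrix here is a random stand-in, not the output of an actual CCA fit):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))     # 500 samples of a 5-dimensional X
Sigma_XX = np.cov(X, rowvar=False)    # sample covariance estimate of Sigma_XX

A = rng.standard_normal((5, 2))       # stand-in for weight vectors a_1, a_2
loadings_X = Sigma_XX @ A             # columns are the loading vectors Sigma_XX a_k
```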
Computation
Derivation
Let $\Sigma_{XY}$ be the cross-covariance matrix for any pair of (vector-shaped) random variables $X$ and $Y$. The target function to maximize is

\rho = \frac{a^T \Sigma_{XY} b}{\sqrt{a^T \Sigma_{XX} a} \, \sqrt{b^T \Sigma_{YY} b}}.
The first step is to define a change of basis and define

c = \Sigma_{XX}^{1/2} a, \qquad d = \Sigma_{YY}^{1/2} b,

where $\Sigma_{XX}^{1/2}$ and $\Sigma_{YY}^{1/2}$ can be obtained from the eigen-decomposition (or by diagonalization):

\Sigma_{XX}^{1/2} = V_X D_X^{1/2} V_X^T, \qquad V_X D_X V_X^T = \Sigma_{XX},

and

\Sigma_{YY}^{1/2} = V_Y D_Y^{1/2} V_Y^T, \qquad V_Y D_Y V_Y^T = \Sigma_{YY}.
Thus

\rho = \frac{c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} d}{\sqrt{c^T c} \, \sqrt{d^T d}}.
By the Cauchy–Schwarz inequality,

\left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2}\right)(d) \leq \left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} \Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c\right)^{1/2} \left(d^T d\right)^{1/2},

so that

\rho \leq \frac{\left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2} c\right)^{1/2}}{\left(c^T c\right)^{1/2}}.
There is equality if the vectors $d$ and $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c$ are collinear. In addition, the maximum of correlation is attained if $c$ is the eigenvector with the maximum eigenvalue for the matrix $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2}$ (see Rayleigh quotient). The subsequent pairs are found by using eigenvalues of decreasing magnitudes. Orthogonality is guaranteed by the symmetry of the correlation matrices.
Another way of viewing this computation is that $c$ and $d$ are the left and right singular vectors of the correlation matrix $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2}$ of $X$ and $Y$ corresponding to the highest singular value.
Solution
The solution is therefore:

- $c$ is an eigenvector of $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2}$
- $d$ is proportional to $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c$

Reciprocally, there is also:

- $d$ is an eigenvector of $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1} \Sigma_{XY} \Sigma_{YY}^{-1/2}$
- $c$ is proportional to $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} d$

Reversing the change of coordinates, we have that:

- $a$ is an eigenvector of $\Sigma_{XX}^{-1} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX}$,
- $b$ is proportional to $\Sigma_{YY}^{-1} \Sigma_{YX} a$;
- $b$ is an eigenvector of $\Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1} \Sigma_{XY}$,
- $a$ is proportional to $\Sigma_{XX}^{-1} \Sigma_{XY} b$.

The canonical variables are defined by:

U = c^T \Sigma_{XX}^{-1/2} X = a^T X, \qquad V = d^T \Sigma_{YY}^{-1/2} Y = b^T Y.
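These formulas translate directly into a numerical recipe. The following is a minimal NumPy sketch (simulated data, illustrative names) recovering the canonical correlations as square roots of the eigenvalues of $\Sigma_{XX}^{-1} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX}$:

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((1000, 3))                    # shared latent signal
X = Z @ rng.standard_normal((3, 3))                   # 3-dimensional X
Y = Z[:, :2] @ rng.standard_normal((2, 2)) \
    + rng.standard_normal((1000, 2))                  # 2-dimensional Y, noisy

C = np.cov(np.hstack([X, Y]), rowvar=False)
Sxx, Syy, Sxy = C[:3, :3], C[3:, 3:], C[:3, 3:]

# a_k is an eigenvector of Sxx^{-1} Sxy Syy^{-1} Syx; the corresponding
# eigenvalue is the squared canonical correlation rho_k^2.
M = np.linalg.inv(Sxx) @ Sxy @ np.linalg.inv(Syy) @ Sxy.T
rho = np.sqrt(np.sort(np.linalg.eigvals(M).real)[::-1][:2])
print(rho)   # the two canonical correlations, in decreasing order
```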
Implementation
CCA can be computed using singular value decomposition on a correlation matrix.[8] It is available as a function in many statistical software packages,[9] including MATLAB (canoncorr), R (cancor), and Python (e.g. scikit-learn's cross-decomposition module, or the CCA-Zoo library[10]).
CCA computation using singular value decomposition on a correlation matrix is related to the cosine of the angles between flats. The cosine function is ill-conditioned for small angles, leading to very inaccurate computation of highly correlated principal vectors in finite-precision computer arithmetic. To fix this trouble, alternative sine-based algorithms are available, for example the principal-angle routine scipy.linalg.subspace_angles in SciPy.
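As an illustration, here is a minimal, self-contained NumPy sketch of the SVD-based computation described above (samples in rows; the function and variable names are our own, not a library API):

```python
import numpy as np

def cca_svd(X, Y):
    """Sample canonical correlations and weight vectors via an SVD of the
    whitened cross-covariance Sxx^{-1/2} Sxy Syy^{-1/2}."""
    n = len(X)
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Sxx = Xc.T @ Xc / (n - 1)
    Syy = Yc.T @ Yc / (n - 1)
    Sxy = Xc.T @ Yc / (n - 1)

    def inv_sqrt(S):
        # symmetric inverse square root via an eigendecomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(w ** -0.5) @ V.T

    Kx, Ky = inv_sqrt(Sxx), inv_sqrt(Syy)
    U, rho, Vt = np.linalg.svd(Kx @ Sxy @ Ky)
    k = min(X.shape[1], Y.shape[1])
    A = Kx @ U[:, :k]      # columns are the weight vectors a_1, ..., a_k
    B = Ky @ Vt.T[:, :k]   # columns are the weight vectors b_1, ..., b_k
    return rho[:k], A, B
```

The function returns the canonical correlations together with the weight matrices; the corresponding loadings then follow as Sxx @ A and Syy @ B.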
Hypothesis testing
Each row can be tested for significance with the following method. Since the correlations are sorted, saying that row $i$ is zero implies all further correlations are also zero. If we have $p$ independent observations in a sample and $\widehat{\rho}_i$ is the estimated correlation for $i = 1, \dots, \min\{m, n\}$, then for the $i$-th row the test statistic is:

\chi^2 = -\left(p - 1 - \frac{1}{2}(m + n + 1)\right) \ln \prod_{j=i}^{\min\{m,n\}} \left(1 - \widehat{\rho}_j^2\right),

which is asymptotically distributed as a chi-squared with $(m - i + 1)(n - i + 1)$ degrees of freedom for large $p$.[11]
Since all the correlations beyond the $\min\{m,n\}$-th are logically zero (and estimated that way also), the product over the terms after this point is irrelevant.
Note that in the small sample size limit with $p < n + m$ we are then guaranteed that the top $m + n - p$ correlations will be identically 1, and hence the test is meaningless.[12]
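The test is straightforward to implement; the following is a sketch in Python using SciPy (the function and argument names are our own, and indices are 0-based, so the degrees of freedom become $(n - i)(m - i)$):

```python
import numpy as np
from scipy.stats import chi2

def cca_significance_tests(rho_hat, p, n, m):
    """For each row i (0-based), test that the i-th and all later
    canonical correlations are zero; returns (statistic, p-value) pairs."""
    rho_hat = np.asarray(rho_hat)
    results = []
    for i in range(len(rho_hat)):
        stat = -(p - 1 - (m + n + 1) / 2) * np.log(np.prod(1 - rho_hat[i:] ** 2))
        df = (n - i) * (m - i)   # equals (n - i + 1)(m - i + 1) with 1-based i
        results.append((stat, chi2.sf(stat, df)))
    return results

print(cca_significance_tests([0.6, 0.2], p=200, n=3, m=2))
```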
Practical uses
A typical use for canonical correlation in the experimental context is to take two sets of variables and see what is common among the two sets.[13] For example, in psychological testing, one could take two well-established multidimensional personality tests such as the Minnesota Multiphasic Personality Inventory (MMPI-2) and the NEO. By seeing how the MMPI-2 factors relate to the NEO factors, one could gain insight into what dimensions were common between the tests and how much variance was shared. For example, one might find that an extraversion or neuroticism dimension accounted for a substantial amount of shared variance between the two tests.
One can also use canonical-correlation analysis to produce a model equation which relates two sets of variables, for example a set of performance measures and a set of explanatory variables, or a set of outputs and a set of inputs. Constraints can be imposed on such a model to ensure it reflects theoretical requirements or intuitively obvious conditions. This type of model is known as a maximum correlation model.[14]
Visualization of the results of canonical correlation is usually through bar plots of the coefficients of the two sets of variables for the pairs of canonical variates showing significant correlation. Some authors suggest that they are best visualized by plotting them as heliographs, a circular format with ray-like bars, with each half representing the two sets of variables.[15]
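A minimal matplotlib sketch of such a bar-plot display, with made-up coefficients for the first canonical pair:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical weights for the first pair of canonical variates.
a = np.array([0.8, -0.3, 0.5])   # coefficients for the X-set variables
b = np.array([0.6, 0.7])         # coefficients for the Y-set variables

fig, axes = plt.subplots(1, 2, sharey=True)
axes[0].bar(["x1", "x2", "x3"], a)
axes[0].set_title("X weights, pair 1")
axes[1].bar(["y1", "y2"], b)
axes[1].set_title("Y weights, pair 1")
plt.show()
```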
Examples
Let $X = x_1$ be a random variable with zero expected value, i.e., $\operatorname{E}(X) = 0$.

- If $Y = X$, i.e., $X$ and $Y$ are perfectly correlated, then, e.g., $a = 1$ and $b = 1$, so that the first (and only in this example) pair of canonical variables is $U = X$ and $V = Y = X$.
- If $Y = -X$, i.e., $X$ and $Y$ are perfectly anticorrelated, then, e.g., $a = 1$ and $b = -1$, so that the first (and only in this example) pair of canonical variables is $U = X$ and $V = -Y = X$.

We notice that in both cases $U = V$, which illustrates that the canonical-correlation analysis treats correlated and anticorrelated variables similarly.
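A small simulation makes this symmetry concrete (a NumPy sketch; the sign choice for $b$ mirrors the example above):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1000)
for y, label in [(x, "Y = X"), (-x, "Y = -X")]:
    # In one dimension the weights are scalars; taking a = 1 and
    # b = sign(cov(x, y)) gives U = V in both cases.
    b = np.sign(np.cov(x, y)[0, 1])
    U, V = x, b * y
    print(label, np.corrcoef(U, V)[0, 1])   # 1.0 in both cases
```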
Connection to principal angles
Assuming that $X = (x_1, \dots, x_n)^T$ and $Y = (y_1, \dots, y_m)^T$ have zero expected values, i.e., $\operatorname{E}(X) = \operatorname{E}(Y) = 0$, their covariance matrices $\Sigma_{XX} = \operatorname{Cov}(X, X) = \operatorname{E}[X X^T]$ and $\Sigma_{YY} = \operatorname{Cov}(Y, Y) = \operatorname{E}[Y Y^T]$ can be viewed as Gram matrices in an inner product for the entries of $X$ and $Y$, correspondingly. In this interpretation, the random variables, entries $x_i$ of $X$ and $y_j$ of $Y$, are treated as elements of a vector space with an inner product given by the covariance $\operatorname{cov}(x_i, y_j)$; see Covariance#Relationship to inner products.
The definition of the canonical variables $U$ and $V$ is then equivalent to the definition of principal vectors for the pair of subspaces spanned by the entries of $X$ and $Y$ with respect to this inner product. The canonical correlations $\operatorname{corr}(U, V)$ are equal to the cosines of the principal angles.
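This correspondence can be checked numerically: scipy.linalg.subspace_angles computes principal angles between the column spans of two matrices, and on centred data matrices their cosines coincide with the sample canonical correlations (a sketch on simulated data):

```python
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(2)
Z = rng.standard_normal((200, 4))
X = Z @ rng.standard_normal((4, 3)) + 0.1 * rng.standard_normal((200, 3))
Y = Z @ rng.standard_normal((4, 2)) + 0.1 * rng.standard_normal((200, 2))
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)

# The cosines of the principal angles between span(Xc) and span(Yc)
# are the sample canonical correlations of X and Y.
print(np.cos(subspace_angles(Xc, Yc)))
```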
Whitening and probabilistic canonical correlation analysis
CCA can also be viewed as a special whitening transformation where the random vectors $X$ and $Y$ are simultaneously transformed in such a way that the cross-correlation between the whitened vectors $X^{CCA}$ and $Y^{CCA}$ is diagonal.[16] The canonical correlations are then interpreted as regression coefficients linking $X^{CCA}$ and $Y^{CCA}$, and may also be negative. The regression view of CCA also provides a way to construct a latent variable probabilistic generative model for CCA, with uncorrelated hidden variables representing shared and non-shared variability.
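A brief numerical illustration of the diagonal cross-correlation property, reusing the SVD construction sketched under Implementation (names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
Z = rng.standard_normal((2000, 3))
X = Z @ rng.standard_normal((3, 3)) + 0.5 * rng.standard_normal((2000, 3))
Y = Z @ rng.standard_normal((3, 2)) + 0.5 * rng.standard_normal((2000, 2))

def inv_sqrt(S):
    # symmetric inverse square root via an eigendecomposition
    w, V = np.linalg.eigh(S)
    return V @ np.diag(w ** -0.5) @ V.T

n = len(X)
Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
Kx = inv_sqrt(Xc.T @ Xc / (n - 1))
Ky = inv_sqrt(Yc.T @ Yc / (n - 1))
U, rho, Vt = np.linalg.svd(Kx @ (Xc.T @ Yc / (n - 1)) @ Ky)

# CCA-whitened vectors: each set is internally uncorrelated with unit
# variance, and their cross-covariance is diagonal with entries rho_k.
Xw, Yw = Xc @ Kx @ U, Yc @ Ky @ Vt.T
print(np.round(Xw.T @ Yw / (n - 1), 3))   # approximately diag(rho)
print(np.round(rho, 3))
```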
Notes and References
1. Härdle, Wolfgang; Simar, Léopold (2007). "Canonical Correlation Analysis". Applied Multivariate Statistical Analysis. pp. 321–330. doi:10.1007/978-3-540-72244-1_14. ISBN 978-3-540-72243-4.
2. Knapp, T. R. (1978). "Canonical correlation analysis: A general parametric significance-testing system". Psychological Bulletin. 85 (2): 410–416. doi:10.1037/0033-2909.85.2.410.
3. Hotelling, H. (1936). "Relations Between Two Sets of Variates". Biometrika. 28 (3–4): 321–377. doi:10.1093/biomet/28.3-4.321. JSTOR 2333955.
4. Jordan, C. (1875). "Essai sur la géométrie à $n$ dimensions". Bull. Soc. Math. France. 3: 103.
5. Ju, Ce; Kobler, Reinmar J.; Tang, Liyao; Guan, Cuntai; Kawanabe, Motoaki (2024). "Deep Geodesic Canonical Correlation Analysis for Covariance-Based Neuroimaging Data". The Twelfth International Conference on Learning Representations (ICLR 2024, spotlight).
6. "Statistical Learning with Sparsity: the Lasso and Generalizations". hastie.su.domains. Retrieved 2023-09-12.
7. Gu, Fei; Wu, Hao (2018). "Simultaneous canonical correlation analysis with invariant canonical loadings". Behaviormetrika. 45 (1): 111–132. doi:10.1007/s41237-017-0042-8.
8. Hsu, D.; Kakade, S. M.; Zhang, T. (2012). "A spectral algorithm for learning Hidden Markov Models". Journal of Computer and System Sciences. 78 (5): 1460. doi:10.1016/j.jcss.2011.12.025. arXiv:0811.4413.
9. Huang, S. Y.; Lee, M. H.; Hsiao, C. K. (2009). "Nonlinear measures of association with kernel canonical correlation analysis and applications". Journal of Statistical Planning and Inference. 139 (7): 2162. doi:10.1016/j.jspi.2008.10.011.
10. Chapman, James; Wang, Hao-Ting (2021). "CCA-Zoo: A collection of Regularized, Deep Learning based, Kernel, and Probabilistic CCA methods in a scikit-learn style framework". Journal of Open Source Software. 6 (68): 3823. doi:10.21105/joss.03823.
11. Mardia, Kanti V.; Kent, J. T.; Bibby, J. M. (1979). Multivariate Analysis. Academic Press.
12. Song, Yang; Schreier, Peter J.; Ramírez, David; Hasija, Tanuj. "Canonical correlation analysis of high-dimensional data with very small sample support".
13. Sieranoja, S.; Sahidullah, Md; Kinnunen, T.; Komulainen, J.; Hadid, A. (July 2018). "Audiovisual Synchrony Detection with Optimized Audio Features". 2018 IEEE 3rd International Conference on Signal and Image Processing (ICSIP). pp. 377–381. doi:10.1109/SIPROCESS.2018.8600424. ISBN 978-1-5386-6396-7.
14. Tofallis, C. (1999). "Model Building with Multiple Dependent Variables and Constraints". Journal of the Royal Statistical Society, Series D. 48 (3): 371–378. doi:10.1111/1467-9884.00195. arXiv:1109.0725.
15. Degani, A.; Shafto, M.; Olson, L. (2006). "Canonical Correlation Analysis: Use of Composite Heliographs for Representing Multiple Patterns". Diagrammatic Representation and Inference. Lecture Notes in Computer Science. Vol. 4045. p. 93. doi:10.1007/11783183_11. ISBN 978-3-540-35623-3.
16. Jendoubi, T.; Strimmer, K. (2018). "A whitening approach to probabilistic canonical correlation analysis for omics data integration". BMC Bioinformatics. 20 (1): 15. doi:10.1186/s12859-018-2572-9. arXiv:1802.03490. PMID 30626338.