Functional correlation
In statistics, functional correlation quantifies the correlation and dependence between two random variables when the data are functional; because such data are infinite-dimensional, dimension reduction techniques are used to define and estimate it. Several approaches have been developed to quantify the relation between two functional variables.
Overview
A pair of real-valued random functions $X(t)$ and $Y(t)$, with $t \in I$, a compact interval, can be viewed as realizations of square-integrable stochastic processes in the Hilbert space $L^2(I)$. Since both $X$ and $Y$ are infinite-dimensional, some kind of dimension reduction is required to explore their relationship. Notions of correlation for functional data include the following.[1]
Functional canonical correlation coefficient (FCCA)
FCCA is a direct extension of multivariate canonical correlation. For a pair of random functions $X$ and $Y$ in $L^2(I)$ the first canonical coefficient $\rho_1$ is defined as

$$\rho_1 = \sup_{u,\,v \in L^2(I)} \operatorname{corr}(\langle u, X\rangle, \langle v, Y\rangle), \qquad (1)$$

where $\langle \cdot,\cdot \rangle$ denotes the inner product in $L^2(I)$, i.e.

$$\langle f_1, f_2\rangle = \int_I f_1(t)\, f_2(t)\, dt.$$

The $k$-th canonical coefficient $\rho_k$, given $\rho_1, \rho_2, \ldots, \rho_{k-1}$, is defined as

$$\rho_k = \sup_{u,\,v \in L^2(I)} \operatorname{corr}(\langle u, X\rangle, \langle v, Y\rangle), \qquad (2)$$

where the pair $(U_k, V_k) = (\langle u_k, X\rangle, \langle v_k, Y\rangle)$ is required to be uncorrelated with all previous pairs $(U_j, V_j) = (\langle u_j, X\rangle, \langle v_j, Y\rangle)$, $j = 1, 2, \ldots, k-1$.
Thus FCCA implements projections in the directions of $u_k$ and $v_k$ for $X$ and $Y$ respectively, such that their linear combinations (inner products) $U_k$ and $V_k$ are maximally correlated. $X$ and $Y$ are uncorrelated if all their canonical correlations are zero, equivalently, if and only if $\rho_1 = 0$.
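To make the objective in (1) and (2) concrete, the following minimal sketch (in Python with NumPy; the grid, the simulated curves, and the particular weight functions $u$ and $v$ are all illustrative assumptions, not taken from any reference) evaluates the sample version of $\operatorname{corr}(\langle u, X\rangle, \langle v, Y\rangle)$ for one fixed pair $(u, v)$; FCCA takes the supremum of this quantity over all square-integrable $u$ and $v$.

```python
import numpy as np

# Sample version of corr(<u, X>, <v, Y>) for one fixed pair of weight functions (u, v).
# All names below (grid, curves, u, v) are illustrative assumptions.
rng = np.random.default_rng(1)
n, m = 200, 101                       # n curve pairs observed on m grid points
t = np.linspace(0.0, 1.0, m)          # dense grid on the compact interval I = [0, 1]
dt = t[1] - t[0]

A = rng.normal(size=(n, 1))           # shared random amplitude -> X and Y are dependent
X = A * np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=(n, m))
Y = A * np.cos(2 * np.pi * t) + 0.1 * rng.normal(size=(n, m))

u = np.sin(2 * np.pi * t)             # candidate weight functions (not the optimal u_1, v_1)
v = np.cos(2 * np.pi * t)

U = (X * u).sum(axis=1) * dt          # <u, X_i> approximated by a Riemann sum
V = (Y * v).sum(axis=1) * dt          # <v, Y_i>

# rho_1 in (1) is the supremum of this correlation over all u, v in L^2(I).
print(np.corrcoef(U, V)[0, 1])
```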
Alternative formulation
The cross-covariance operator for two random functions $X$ and $Y$ is defined as $\Sigma_{XY}: L^2(I) \to L^2(I)$,

$$\Sigma_{XY}v(t) = \int_I \operatorname{cov}\bigl(X(t), Y(s)\bigr)\, v(s)\, ds, \qquad v \in L^2(I),$$

and analogously the auto-covariance operators $\Sigma_{XX}$ for $X$ and $\Sigma_{YY}$ for $Y$. Using $\operatorname{cov}(\langle u, X\rangle, \langle v, Y\rangle) = \langle u, \Sigma_{XY} v\rangle$, the $k$-th canonical coefficient in (2) can be re-written as

$$\rho_k = \sup_{u,\,v \in L^2(I)} \frac{\langle u, \Sigma_{XY} v\rangle}{\sqrt{\langle u, \Sigma_{XX} u\rangle\,\langle v, \Sigma_{YY} v\rangle}}, \qquad (3)$$

where $(U_k, V_k) = (\langle u_k, X\rangle, \langle v_k, Y\rangle)$ is uncorrelated with all previous pairs $(U_j, V_j) = (\langle u_j, X\rangle, \langle v_j, Y\rangle)$, $j = 1, 2, \ldots, k-1$.

Maximizing (3) is equivalent to finding the eigenvalues and eigenvectors of the operator $R = \Sigma_{XX}^{-1/2}\, \Sigma_{XY}\, \Sigma_{YY}^{-1/2}$.
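As a rough illustration of the operator formulation (3), the sketch below (an assumption-laden discretization, not a reference implementation) approximates $\Sigma_{XX}$, $\Sigma_{YY}$ and $\Sigma_{XY}$ by sample covariance matrices on a uniform grid, forms a ridge-regularized version of $R$, and takes its largest singular value as an estimate of $\rho_1$.

```python
import numpy as np

# Discretized, ridge-regularized version of R = Sigma_XX^{-1/2} Sigma_XY Sigma_YY^{-1/2}.
# Grid, curves and the regularization strength eps are illustrative assumptions.
rng = np.random.default_rng(2)
n, m = 300, 51
t = np.linspace(0.0, 1.0, m)

A = rng.normal(size=(n, 1))
X = A * np.sin(2 * np.pi * t) + 0.2 * rng.normal(size=(n, m))
Y = A * np.cos(2 * np.pi * t) + 0.2 * rng.normal(size=(n, m))

Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
C_xx = Xc.T @ Xc / (n - 1)            # sample analogue of Sigma_XX on the grid
C_yy = Yc.T @ Yc / (n - 1)            # sample analogue of Sigma_YY
C_xy = Xc.T @ Yc / (n - 1)            # sample analogue of Sigma_XY

def inv_sqrt(C, eps):
    """Regularized inverse square root (C + eps*I)^(-1/2) of a symmetric PSD matrix."""
    w, V = np.linalg.eigh(C)
    return V @ np.diag(1.0 / np.sqrt(np.clip(w, 0.0, None) + eps)) @ V.T

eps = 1e-2                            # ridge term: C_xx and C_yy are nearly singular
R = inv_sqrt(C_xx, eps) @ C_xy @ inv_sqrt(C_yy, eps)

# The largest singular value of R estimates the first canonical correlation rho_1 in (3).
print(np.linalg.svd(R, compute_uv=False)[0])
```

On a uniform grid the quadrature weights cancel in the ratio (3), so the raw sample covariance matrices can be used directly; the size of the ridge term $\varepsilon$ is a tuning assumption that anticipates the ill-posedness discussed next.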
Challenges
Since $\Sigma_{XX}$ and $\Sigma_{YY}$ are compact operators, the square root of the auto-covariance operator of $L^2$ processes may not be invertible. So the existence of $R = \Sigma_{XX}^{-1/2}\, \Sigma_{XY}\, \Sigma_{YY}^{-1/2}$, and hence the computation of its eigenvalues and eigenvectors, is an ill-posed problem. As a consequence of this inverse problem, overfitting may occur, which may lead to an unstable correlation coefficient. Due to this inverse problem, $\rho_1$ tends to be biased upwards and therefore close to 1, and hence is difficult to interpret. FCCA also requires densely recorded functional data so that the inner products in (2) can be accurately evaluated.
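The upward bias can be seen directly in simulation. In this hedged sketch (sample size, grid and curves are arbitrary choices for illustration), $X$ and $Y$ are generated independently, yet the unregularized sample version of (3) still returns a first canonical correlation near 1.

```python
import numpy as np

# Demonstration of the ill-posedness: X and Y are generated independently, yet the
# unregularized sample canonical correlation is (near) 1. Sizes and curves are assumptions.
rng = np.random.default_rng(6)
n, m = 100, 51
t = np.linspace(0.0, 1.0, m)

X = rng.normal(size=(n, 1)) * np.sin(2 * np.pi * t) + 0.2 * rng.normal(size=(n, m))
Y = rng.normal(size=(n, 1)) * np.cos(2 * np.pi * t) + 0.2 * rng.normal(size=(n, m))  # independent of X

Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
C_xx, C_yy, C_xy = Xc.T @ Xc / (n - 1), Yc.T @ Yc / (n - 1), Xc.T @ Yc / (n - 1)

def pinv_sqrt(C):
    """Unregularized inverse square root of a symmetric PSD matrix via its pseudo-inverse."""
    w, V = np.linalg.eigh(C)
    w = np.clip(w, 0.0, None)
    inv = np.zeros_like(w)
    inv[w > 1e-12] = 1.0 / np.sqrt(w[w > 1e-12])
    return V @ np.diag(inv) @ V.T

rho1 = np.linalg.svd(pinv_sqrt(C_xx) @ C_xy @ pinv_sqrt(C_yy), compute_uv=False)[0]
print(rho1)   # typically very close to 1 even though X and Y are independent
```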
Possible solutions
Some possible solutions to this problem have been discussed.
- By restricting the maximization in (1) to discrete $\ell^2$ sequence spaces, or to a reproducing kernel Hilbert space, instead of the entire $L^2$ space.[2] [3] A sketch of one such restriction, via truncation to a few leading components, follows below.
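A minimal sketch, assuming the restriction is implemented by truncating each curve to its first $K$ functional principal component scores (the grid, the simulated curves and $K$ are illustrative choices), which reduces the problem to a well-posed $K$-dimensional multivariate CCA:

```python
import numpy as np

# Restrict to the first K functional principal component scores (a truncated l^2 sequence)
# and run ordinary multivariate CCA on the K-dimensional scores. K, grid and curves are
# illustrative assumptions.
rng = np.random.default_rng(3)
n, m, K = 300, 51, 3
t = np.linspace(0.0, 1.0, m)
dt = t[1] - t[0]

A = rng.normal(size=(n, 1))
X = A * np.sin(2 * np.pi * t) + 0.2 * rng.normal(size=(n, m))
Y = A * np.cos(2 * np.pi * t) + 0.2 * rng.normal(size=(n, m))

def fpc_scores(Z, K):
    """Scores of the curves on their first K estimated eigenfunctions."""
    Zc = Z - Z.mean(axis=0)
    cov_op = Zc.T @ Zc / (Z.shape[0] - 1) * dt        # discretized covariance operator
    w, V = np.linalg.eigh(cov_op)
    phi = V[:, ::-1][:, :K] / np.sqrt(dt)             # leading eigenfunctions, unit L^2 norm
    return Zc @ phi * dt                              # <Z_i - mean, phi_k>

Sx, Sy = fpc_scores(X, K), fpc_scores(Y, K)

def inv_sqrt(C):
    w, V = np.linalg.eigh(C)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

C_xx, C_yy = np.cov(Sx, rowvar=False), np.cov(Sy, rowvar=False)
C_xy = Sx.T @ Sy / (n - 1)                            # scores are already centered

# Canonical correlations of the truncated representations: a well-posed K x K problem.
print(np.linalg.svd(inv_sqrt(C_xx) @ C_xy @ inv_sqrt(C_yy), compute_uv=False))
```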
Functional singular correlation analysis (FSCA)
FSCA bypasses the inverse problem by simply replacing the objective function by covariance in place of correlation in (2). FSCA aims to quantify the dependency of $X$ and $Y$ by implementing the concept of functional singular-value decomposition for the cross-covariance operator. FSCA can be viewed as an extension of analyses using singular-value decomposition of vector data to functional data. For a pair of random functions $X(t)$ and $Y(t)$ with smooth mean functions $\mu_X$ and $\mu_Y$ and smooth covariance functions, FSCA aims at a "functional covariance" corresponding to the first singular value of the cross-covariance operator $\Sigma_{XY}$,

$$\sup_{\|u\| = \|v\| = 1} \operatorname{cov}(\langle u, X\rangle, \langle v, Y\rangle), \qquad (4)$$

which is attained at functions $(u_1, v_1)$. A standardized version of this serves as a functional correlation and is defined as

$$\rho_1^{\mathrm{SVD}} = \operatorname{corr}(\langle u_1, X\rangle, \langle v_1, Y\rangle).$$

The singular representation of the cross-covariance can be employed to find a solution to the maximization problem (4).[5] Analogously, this concept can be extended to find the next ordered singular correlation coefficients $\rho_2^{\mathrm{SVD}}, \rho_3^{\mathrm{SVD}}, \ldots$, using the subsequent pairs of singular functions.
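The following sketch illustrates the FSCA idea on discretized data (grid and simulated curves are assumptions): the SVD of the estimated cross-covariance surface yields the first singular value, i.e. the "functional covariance" in (4), together with the singular functions $(u_1, v_1)$, whose projection scores give the standardized singular correlation; no covariance operator has to be inverted.

```python
import numpy as np

# FSCA sketch: SVD of the discretized cross-covariance surface gives the first singular
# value ("functional covariance") and singular functions (u_1, v_1); the standardized
# singular correlation is corr(<u_1, X>, <v_1, Y>). Grid and curves are assumptions.
rng = np.random.default_rng(4)
n, m = 300, 51
t = np.linspace(0.0, 1.0, m)
dt = t[1] - t[0]

A = rng.normal(size=(n, 1))
X = A * np.sin(2 * np.pi * t) + 0.2 * rng.normal(size=(n, m))
Y = A * np.cos(2 * np.pi * t) + 0.2 * rng.normal(size=(n, m))

Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
C_xy = Xc.T @ Yc / (n - 1)                  # estimate of cov(X(s), Y(t)) on the grid

U, s, Vt = np.linalg.svd(C_xy * dt)         # dt accounts for the quadrature weight
u1 = U[:, 0] / np.sqrt(dt)                  # first singular functions, unit L^2 norm
v1 = Vt[0] / np.sqrt(dt)

scores_u = Xc @ u1 * dt                     # <u_1, X_i - mu_X>
scores_v = Yc @ v1 * dt                     # <v_1, Y_i - mu_Y>

# First singular value = "functional covariance" in (4); standardized version = correlation.
print(s[0], np.corrcoef(scores_u, scores_v)[0, 1])
```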
Correlation as angle between functions
In the multivariate case, the inner product of two vectors $a$ and $b$ is defined as

$$\langle a, b\rangle = \|a\|\, \|b\| \cos\alpha,$$

where $\alpha$ is the angle between $a$ and $b$. This can be extended to the space of square-integrable random functions. For this notion to be a meaningful measure of alignment of shapes, the integrals of the functions, which are the projections on the constant function 1, are subtracted. This part corresponds to a "static part" and the remainder can be thought of as a "dynamic part" for each random function. The cosine of the angle between these "dynamic parts" then provides a correlation measure of functional shapes. Denoting $M_X = \int_I X(t)\, dt$ and $M_Y = \int_I Y(t)\, dt$, the standardized curves may be defined as[6] [7]

$$X^*(t) = \frac{X(t) - M_X}{\left\{\int_I \left(X(t) - M_X\right)^2 dt\right\}^{1/2}}, \qquad Y^*(t) = \frac{Y(t) - M_Y}{\left\{\int_I \left(Y(t) - M_Y\right)^2 dt\right\}^{1/2}},$$

and the correlation is defined as the expected cosine of the angle between them,

$$\rho = E\left[\langle X^*, Y^*\rangle\right] = E\left[\int_I X^*(t)\, Y^*(t)\, dt\right].$$
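A sample version of this angle-based correlation can be computed as in the following sketch (grid and simulated curves are illustrative assumptions): each curve's integral is removed as the "static part", the remaining "dynamic part" is scaled to unit $L^2$ norm, and the cosines $\langle X_i^*, Y_i^*\rangle$ are averaged over the sample.

```python
import numpy as np

# Angle-based correlation sketch: remove each curve's static part (its integral), scale the
# dynamic part to unit L^2 norm, and average the cosines over the sample. Grid and curves
# are illustrative assumptions.
rng = np.random.default_rng(5)
n, m = 200, 101
t = np.linspace(0.0, 1.0, m)
dt = t[1] - t[0]

A = rng.normal(size=(n, 1))
X = A * np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=(n, m))
Y = A * np.sin(2 * np.pi * t + 0.3) + 0.1 * rng.normal(size=(n, m))

def standardized(Z):
    """Dynamic part Z - M_Z (M_Z is the integral of Z, the projection on 1 since |I| = 1),
    scaled to unit L^2 norm."""
    D = Z - Z.sum(axis=1, keepdims=True) * dt
    return D / np.sqrt((D ** 2).sum(axis=1, keepdims=True) * dt)

Xs, Ys = standardized(X), standardized(Y)

# Sample estimate of rho = E<X*, Y*>: the average cosine between the dynamic parts.
print(np.mean((Xs * Ys).sum(axis=1) * dt))
```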
Notes and References
- Wang JL, Chiou JM, Müller HG (2015). "Review of functional data analysis". Annual Review of Statistics and Its Application.
- Eubank RL, Hsing T (2008). "Canonical correlation for stochastic processes". Stochastic Processes and their Applications. 118:1634–1661.
- Yuan M, Cai TT (2010). "A reproducing kernel Hilbert space approach to functional linear regression". The Annals of Statistics. 38:3412–3444.
- He G, Müller HG, Wang JL (2004). "Methods of canonical analysis for functional data". Journal of Statistical Planning and Inference. 122(1):141–159.
- Yang W, Müller HG, Stadtmüller U (2011). "Functional singular component analysis". Journal of the Royal Statistical Society. 68(1):3–25.
- Dubin JA, Müller HG (2005). "Dynamical correlation for multivariate longitudinal data". Journal of the American Statistical Association. 100(471):872–881.
- Opgen-Rhein R, Strimmer K (2006). "Inferring gene dependency networks from genomic longitudinal data: a functional data approach". REVSTAT – Statistical Journal. 4:53–65.