In statistics, modes of variation[1] are a continuously indexed set of vectors or functions that are centered at a mean and are used to depict the variation in a population or sample. Typically, the variation patterns in the data can be decomposed in descending order of eigenvalues, with the directions represented by the corresponding eigenvectors or eigenfunctions. Modes of variation provide a visualization of this decomposition and an efficient description of variation around the mean. Both in principal component analysis (PCA) and in functional principal component analysis (FPCA), modes of variation play an important role in visualizing and describing the variation in the data contributed by each eigencomponent.[2] In real-world applications, the eigencomponents and associated modes of variation help interpret complex data, especially in exploratory data analysis (EDA).
Modes of variation are a natural extension of PCA and FPCA.
If a random vector $X = (X_1, X_2, \ldots, X_p)^T$ has mean vector $\boldsymbol{\mu}$ and a $p \times p$ covariance matrix $\Sigma$ with eigenvalues $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_p \geq 0$ and corresponding orthonormal eigenvectors $e_1, e_2, \ldots, e_p$, then $\Sigma$ admits the eigendecomposition

$$\Sigma = Q \Lambda Q^T,$$

where $Q$ is an orthogonal matrix whose columns are the eigenvectors of $\Sigma$, and $\Lambda$ is a diagonal matrix whose entries are the eigenvalues of $\Sigma$. The centered vector can be expanded in terms of the $p$ eigenvectors,

$$X - \boldsymbol{\mu} = \sum_{k=1}^{p} \xi_k e_k,$$

where

$$\xi_k = e_k^T (X - \boldsymbol{\mu})$$

is the $k$-th principal component score of $X$ with respect to $e_k$. The scores satisfy

$$\operatorname{E}(\xi_k) = 0, \quad \operatorname{Var}(\xi_k) = \lambda_k, \quad \operatorname{E}(\xi_k \xi_l) = 0 \text{ for } l \neq k.$$

Then the $k$-th mode of variation of $X$ is the set of vectors, indexed by $\alpha$,

$$m_{k,\alpha} = \boldsymbol{\mu} \pm \alpha \sqrt{\lambda_k}\, e_k, \quad \alpha \in [-A, A],$$

where $A$ is typically selected as 2 or 3.
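The construction above can be sketched numerically. The following Python snippet is a minimal illustration, assuming NumPy and an illustrative mean vector and covariance matrix (not taken from the article); sweeping $\alpha$ over $[-A, A]$ covers both signs in $m_{k,\alpha} = \boldsymbol{\mu} \pm \alpha\sqrt{\lambda_k}\, e_k$:

```python
import numpy as np

# Hypothetical 3-dimensional example: mean vector and covariance matrix
# (illustrative values only).
mu = np.array([1.0, 2.0, 3.0])
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 3.0, 0.2],
                  [0.5, 0.2, 2.0]])

# Eigendecomposition Sigma = Q Lambda Q^T; eigh returns eigenvalues in
# ascending order, so reverse to get lambda_1 >= ... >= lambda_p >= 0.
eigvals, eigvecs = np.linalg.eigh(Sigma)
order = np.argsort(eigvals)[::-1]
lambdas = eigvals[order]          # descending eigenvalues
E = eigvecs[:, order]             # columns e_1, ..., e_p

def mode_of_variation(k, alpha):
    """k-th mode of variation: mu + alpha * sqrt(lambda_k) * e_k (0-based k)."""
    return mu + alpha * np.sqrt(lambdas[k]) * E[:, k]

# Sweep alpha over [-A, A] with A = 2 to trace out the first mode.
A = 2
first_mode = [mode_of_variation(0, a) for a in np.linspace(-A, A, 5)]
```

At $\alpha = 0$ every mode passes through the mean, which is a convenient sanity check.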
Consider a square-integrable random function $X(t)$, $t \in \mathcal{T} \subset \mathbb{R}^p$, where typically $p = 1$ and $\mathcal{T}$ is a bounded and closed interval, with mean function $\mu(t) = \operatorname{E}(X(t))$ and covariance function

$$G(s,t) = \operatorname{Cov}(X(s), X(t)) = \sum_{k=1}^{\infty} \lambda_k \varphi_k(s) \varphi_k(t),$$

where $\lambda_1 \geq \lambda_2 \geq \cdots \geq 0$ are the eigenvalues and $\{\varphi_1, \varphi_2, \ldots\}$ are the orthonormal eigenfunctions of the linear Hilbert–Schmidt operator

$$G: L^2(\mathcal{T}) \to L^2(\mathcal{T}), \quad G(f) = \int_{\mathcal{T}} G(s,t) f(s) \, ds.$$

By the Karhunen–Loève theorem, one can express the centered function in the eigenbasis,

$$X(t) - \mu(t) = \sum_{k=1}^{\infty} \xi_k \varphi_k(t),$$

where

$$\xi_k = \int_{\mathcal{T}} (X(t) - \mu(t)) \varphi_k(t) \, dt$$

is the $k$-th principal component score of $X(t)$ with respect to $\varphi_k$. The scores satisfy

$$\operatorname{E}(\xi_k) = 0, \quad \operatorname{Var}(\xi_k) = \lambda_k, \quad \operatorname{E}(\xi_k \xi_l) = 0 \text{ for } l \neq k.$$

Then the $k$-th mode of variation of $X(t)$ is the set of functions, indexed by $\alpha$,

$$m_{k,\alpha}(t) = \mu(t) \pm \alpha \sqrt{\lambda_k}\, \varphi_k(t), \quad t \in \mathcal{T}, \ \alpha \in [-A, A],$$

that are viewed simultaneously over the range of $\alpha$, usually for $A = 2$ or $3$.
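In practice the eigenfunctions of the covariance operator are often obtained by discretizing the kernel on a grid. The sketch below is illustrative only: it assumes the Brownian-motion kernel $G(s,t) = \min(s,t)$ on $\mathcal{T} = [0,1]$ (chosen because its first eigenvalue $4/\pi^2$ is known in closed form) and a zero mean function, neither of which comes from the article:

```python
import numpy as np

# Midpoint grid on [0, 1] and uniform quadrature weight.
m = 200
t = (np.arange(m) + 0.5) / m
w = 1.0 / m

# Illustrative covariance kernel: G(s, t) = min(s, t) (Brownian motion).
G = np.minimum.outer(t, t)

# Discretized eigenproblem of the integral operator: eigenpairs of w * G.
eigvals, eigvecs = np.linalg.eigh(G * w)
order = np.argsort(eigvals)[::-1]
lambdas = eigvals[order]
phis = eigvecs[:, order] / np.sqrt(w)   # rescaled so that int phi_k^2 dt = 1

mu = np.zeros(m)                        # illustrative mean function mu(t) = 0

def mode(k, alpha):
    """k-th functional mode: mu(t) + alpha * sqrt(lambda_k) * phi_k(t)."""
    return mu + alpha * np.sqrt(lambdas[k]) * phis[:, k]
```

For this kernel the leading eigenvalue of the discretized operator approximates $\lambda_1 = 4/\pi^2 \approx 0.405$, which makes the discretization easy to validate.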
The formulation above is derived from properties of the population; in real-world applications these quantities must be estimated from data. The key idea is to estimate the mean and the covariance.
Suppose the data $x_1, x_2, \ldots, x_n$ are $n$ independent realizations of a $p$-dimensional random vector $X$ with mean vector $\boldsymbol{\mu}$ and covariance matrix $\Sigma$. Let $\overline{x}$ denote the sample mean and $S$ the sample covariance matrix, with eigenvalue–eigenvector pairs $(\hat{\lambda}_1, \hat{e}_1), (\hat{\lambda}_2, \hat{e}_2), \ldots, (\hat{\lambda}_p, \hat{e}_p)$. Then the $k$-th mode of variation of $X$ can be estimated by

$$\hat{m}_{k,\alpha} = \overline{x} \pm \alpha \sqrt{\hat{\lambda}_k}\, \hat{e}_k, \quad \alpha \in [-A, A].$$
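The plug-in estimation can be sketched as follows. This is a minimal illustration on simulated data, assuming NumPy and an illustrative 3-dimensional Gaussian model (the true mean and covariance below are made-up values, used only to generate a sample):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated sample: n = 500 draws from a hypothetical 3-dimensional model.
true_mu = np.array([0.0, 1.0, -1.0])
true_Sigma = np.array([[2.0, 0.6, 0.0],
                       [0.6, 1.5, 0.3],
                       [0.0, 0.3, 1.0]])
x = rng.multivariate_normal(true_mu, true_Sigma, size=500)

# Plug-in estimates: sample mean and sample covariance.
xbar = x.mean(axis=0)
S = np.cov(x, rowvar=False)

# Eigenpairs (lambda_hat_k, e_hat_k) of S, in descending order.
eigvals, eigvecs = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]
lam_hat = eigvals[order]
e_hat = eigvecs[:, order]

def est_mode(k, alpha):
    """Estimated k-th mode: xbar + alpha * sqrt(lam_hat_k) * e_hat_k."""
    return xbar + alpha * np.sqrt(lam_hat[k]) * e_hat[:, k]
```

With a moderate sample size the estimated eigenvalues are close to those of the generating covariance, so the estimated modes track the population modes.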
Consider $n$ realizations $X_1(t), X_2(t), \ldots, X_n(t)$ of a square-integrable random function $X(t)$, $t \in \mathcal{T}$, with mean function $\mu(t) = \operatorname{E}(X(t))$ and covariance function $G(s,t) = \operatorname{Cov}(X(s), X(t))$. Let $\hat{\mu}(t)$ and $\hat{G}(s,t)$ denote the estimates of $\mu(t)$ and $G(s,t)$, with eigenvalue–eigenfunction pairs $(\hat{\lambda}_k, \hat{\varphi}_k)$. Then the $k$-th mode of variation of $X(t)$ can be estimated by

$$\hat{m}_{k,\alpha}(t) = \hat{\mu}(t) \pm \alpha \sqrt{\hat{\lambda}_k}\, \hat{\varphi}_k(t), \quad t \in \mathcal{T}, \ \alpha \in [-A, A].$$
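For densely observed curves on a common grid, the estimation step can be sketched with pointwise sample moments. The snippet below is illustrative: the generating model (a sine mean plus two orthonormal random components with variances 2 and 0.5) is made up for demonstration, and real data would replace the simulated matrix `X`:

```python
import numpy as np

rng = np.random.default_rng(1)

# n curves observed on a common grid of m points (simulated toy sample).
n, m = 300, 100
t = np.linspace(0, 1, m)
# Hypothetical model: mean sin(2*pi*t) plus two orthonormal components,
# sqrt(2)*sin(pi*t) and sqrt(2)*cos(pi*t), with score variances 2 and 0.5.
X = (np.sin(2 * np.pi * t)
     + rng.standard_normal((n, 1)) * np.sqrt(2.0) * np.sqrt(2) * np.sin(np.pi * t)
     + rng.standard_normal((n, 1)) * np.sqrt(0.5) * np.sqrt(2) * np.cos(np.pi * t))

# Pointwise estimates of the mean and covariance functions on the grid.
mu_hat = X.mean(axis=0)
G_hat = np.cov(X, rowvar=False)           # G_hat(s, t) on the grid

# Eigenpairs of the estimated covariance operator (uniform quadrature).
w = t[1] - t[0]
eigvals, eigvecs = np.linalg.eigh(G_hat * w)
order = np.argsort(eigvals)[::-1]
lam_hat = eigvals[order]
phi_hat = eigvecs[:, order] / np.sqrt(w)  # orthonormal in L^2

def est_mode(k, alpha):
    """Estimated k-th mode: mu_hat(t) + alpha * sqrt(lam_hat_k) * phi_hat_k(t)."""
    return mu_hat + alpha * np.sqrt(lam_hat[k]) * phi_hat[:, k]
```

The leading estimated eigenvalues recover the score variances of the generating model, and plotting `est_mode(k, alpha)` for a sweep of $\alpha$ reproduces the mode-of-variation displays described in the examples below.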
Modes of variation are useful for visualizing and describing the variation patterns in the data, sorted by the eigenvalues. In real-world applications, modes of variation associated with eigencomponents help interpret complex data, such as the evolution of function traits[4] and other infinite-dimensional data.[5] To illustrate how modes of variation work in practice, two examples are shown in the graphs to the right, which display the first two modes of variation. The solid curve represents the sample mean function. The dashed, dot-dashed, and dotted curves correspond to modes of variation with $\alpha = \pm 1, \pm 2,$ and $\pm 3$, respectively.
The first graph displays the first two modes of variation of female mortality data from 41 countries in 2003.[6] The object of interest is the log hazard function between ages 0 and 100 years. The first mode of variation suggests that the variation of female mortality is smaller for ages around 0 or 100, and larger for ages around 25. An appropriate and intuitive interpretation is that mortality around age 25 is driven by accidental death, while mortality around 0 or 100 is related to congenital disease or natural death.
Compared to female mortality data, the modes of variation of male mortality data show higher mortality after around age 20, possibly related to the fact that life expectancy is higher for women than for men.