In linear algebra, the generalized singular value decomposition (GSVD) is the name of two different techniques based on the singular value decomposition (SVD). The two versions differ because one version decomposes two matrices (somewhat like the higher-order or tensor SVD) and the other version uses a set of constraints imposed on the left and right singular vectors of a single-matrix SVD.
The generalized singular value decomposition (GSVD) is a matrix decomposition of a pair of matrices which generalizes the singular value decomposition. It was introduced by Van Loan in 1976 and later developed by Paige and Saunders; the Paige–Saunders formulation is the one described here. In contrast to the SVD, which acts on a single matrix, the GSVD simultaneously decomposes a pair of matrices with the same number of columns. The SVD and the GSVD, as well as some other possible generalizations of the SVD,[1][2][3] are extensively used in the study of the conditioning and regularization of linear systems with respect to quadratic semi-norms. In the following, let $\mathbb{F} = \mathbb{R}$ or $\mathbb{F} = \mathbb{C}$.
The generalized singular value decomposition of matrices $A_1 \in \mathbb{F}^{m_1 \times n}$ and $A_2 \in \mathbb{F}^{m_2 \times n}$ is

$$A_1 = U_1 \Sigma_1 [W^* D, 0_D] Q^*, \qquad A_2 = U_2 \Sigma_2 [W^* D, 0_D] Q^*,$$

where

- $U_1 \in \mathbb{F}^{m_1 \times m_1}$ is unitary,
- $U_2 \in \mathbb{F}^{m_2 \times m_2}$ is unitary,
- $Q \in \mathbb{F}^{n \times n}$ is unitary,
- $W \in \mathbb{F}^{k \times k}$ is unitary,
- $D \in \mathbb{R}^{k \times k}$ is the positive-definite diagonal matrix of the nonzero singular values of the stacked matrix $C = \begin{bmatrix} A_1 \\ A_2 \end{bmatrix}$,
- $0_D = 0 \in \mathbb{R}^{k \times (n-k)}$,
- $\Sigma_1 = \lceil I_A, S_1, 0_A \rfloor \in \mathbb{R}^{m_1 \times k}$,
- $\Sigma_2 = \lceil 0_B, S_2, I_B \rfloor \in \mathbb{R}^{m_2 \times k}$,
- $S_1 = \lceil \alpha_{r+1}, \ldots, \alpha_{r+s} \rfloor$ with $1 > \alpha_{r+1} \ge \cdots \ge \alpha_{r+s} > 0$,
- $S_2 = \lceil \beta_{r+1}, \ldots, \beta_{r+s} \rfloor$ with $0 < \beta_{r+1} \le \cdots \le \beta_{r+s} < 1$,
- $I_A = I_r$ and $I_B = I_{k-r-s}$ are identity matrices,
- $0_A = 0 \in \mathbb{R}^{(m_1 - r - s) \times (k - r - s)}$ and $0_B = 0 \in \mathbb{R}^{(m_2 - k + r) \times r}$ are zero matrices (possibly empty),
- $\Sigma_1^* \Sigma_1 = \lceil \alpha_1^2, \ldots, \alpha_k^2 \rfloor$,
- $\Sigma_2^* \Sigma_2 = \lceil \beta_1^2, \ldots, \beta_k^2 \rfloor$,
- $\Sigma_1^* \Sigma_1 + \Sigma_2^* \Sigma_2 = I_k$,
- $k = \mathrm{rank}(C)$.

Here $\lceil \cdot \rfloor$ denotes a diagonal or block-diagonal matrix with the listed entries or blocks placed in order along its diagonal, and $r, s \ge 0$ are integers determined by the pair $(A_1, A_2)$.
We denote $\alpha_1 = \cdots = \alpha_r = 1$, $\alpha_{r+s+1} = \cdots = \alpha_k = 0$, $\beta_1 = \cdots = \beta_r = 0$, and $\beta_{r+s+1} = \cdots = \beta_k = 1$. With this convention, $\alpha_i^2 + \beta_i^2 = 1$ for $i = 1, \ldots, k$, and the $\alpha_i$ and $\beta_i$ are the only nonzero entries of $\Sigma_1$ and $\Sigma_2$.
There are many variations of the GSVD. These variations are related to the fact that it is always possible to multiply $Q^*$ from the left by an arbitrary unitary matrix $E \in \mathbb{F}^{n \times n}$ (so that $E E^* = I$) and absorb the factor elsewhere in the decomposition. Here are some variations of the GSVD:

- With $X = ([W^* D, 0_D] Q^*)^*$:
$$A_1 = U_1 \Sigma_1 X^*, \qquad A_2 = U_2 \Sigma_2 X^*.$$
- With $X^* = [0, R] \hat{Q}^*$, where $R \in \mathbb{F}^{k \times k}$ is upper triangular and $\hat{Q} \in \mathbb{F}^{n \times n}$ is unitary:
$$A_1 = U_1 \Sigma_1 [0, R] \hat{Q}^*, \qquad A_2 = U_2 \Sigma_2 [0, R] \hat{Q}^*.$$
- With $Y = W^* D$, which is invertible:
$$A_1 = U_1 \Sigma_1 [Y, 0_D] Q^*, \qquad A_2 = U_2 \Sigma_2 [Y, 0_D] Q^*.$$
A generalized singular value of $A_1$ and $A_2$ is a pair $(a, b) \in \mathbb{R}^2$ such that

$$\lim_{\delta \to 0} \frac{\det(b^2 A_1^* A_1 - a^2 A_2^* A_2 + \delta I_n)}{\det(\delta I_{n-k})} = 0.$$

We have

$$A_i A_j^* = U_i \Sigma_i Y Y^* \Sigma_j^* U_j^*,$$
$$A_i^* A_j = Q \begin{bmatrix} Y^* \Sigma_i^* \Sigma_j Y & 0 \\ 0 & 0 \end{bmatrix} Q^* = Q_1 Y^* \Sigma_i^* \Sigma_j Y Q_1^*,$$

where $Q_1$ consists of the first $k$ columns of $Q$. By these properties we can show that the generalized singular values are exactly the pairs $(\alpha_i, \beta_i)$:

$$\begin{aligned}
&\lim_{\delta \to 0} \det(b^2 A_1^* A_1 - a^2 A_2^* A_2 + \delta I_n)/\det(\delta I_{n-k}) \\
={}& \lim_{\delta \to 0} \det\!\left(Y^* (b^2 \Sigma_1^* \Sigma_1 - a^2 \Sigma_2^* \Sigma_2) Y + \delta I_k\right) \\
={}& \det\!\left(Y^* (b^2 \Sigma_1^* \Sigma_1 - a^2 \Sigma_2^* \Sigma_2) Y\right) \\
={}& |\det(Y)|^2 \prod_{i=1}^{k} (b^2 \alpha_i^2 - a^2 \beta_i^2).
\end{aligned}$$

This expression vanishes exactly when $a = \alpha_i$ and $b = \beta_i$ for some $i$.
The generalized singular values are sometimes claimed to be those which solve $\det(b^2 A_1^* A_1 - a^2 A_2^* A_2) = 0$. However, this characterization holds only when $k = n$: otherwise the determinant vanishes for every pair $(a, b) \in \mathbb{R}^2$, as can be seen by substituting $\delta = 0$ above.
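The characterization in the $k = n$ case can be checked numerically. The sketch below (illustrative random data, not from the article) recovers the pairs $(\alpha_i, \beta_i)$ from the eigenvalues $\lambda_i = \alpha_i^2/\beta_i^2$ of $(A_2^* A_2)^{-1}(A_1^* A_1)$, which is valid here because $A_2$ is invertible, and verifies that $b^2 A_1^* A_1 - a^2 A_2^* A_2$ is singular at each pair:

```python
import numpy as np

# Random example with k = n, guaranteed by taking A2 square and invertible.
rng = np.random.default_rng(1)
n = 4
A1 = rng.standard_normal((6, n))
A2 = rng.standard_normal((n, n))
M1, M2 = A1.T @ A1, A2.T @ A2

# lambda_i = alpha_i^2 / beta_i^2 are the eigenvalues of M2^{-1} M1.
lam = np.linalg.eigvals(np.linalg.solve(M2, M1)).real
alpha = np.sqrt(lam / (1.0 + lam))
beta = np.sqrt(1.0 / (1.0 + lam))

# At each pair (a, b), b^2 M1 - a^2 M2 is singular: its smallest singular
# value is negligible relative to its largest.
rel = [np.linalg.svd(b**2 * M1 - a**2 * M2, compute_uv=False)
       for a, b in zip(alpha, beta)]
rel = [s[-1] / s[0] for s in rel]
```

By construction each recovered pair satisfies $\alpha_i^2 + \beta_i^2 = 1$, matching the normalization of the decomposition.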
Define $E^+ = E^{-1}$ for any invertible matrix $E \in \mathbb{F}^{n \times n}$, $0^+ = 0^*$ for any zero matrix $0 \in \mathbb{F}^{m \times n}$, and $\lceil E_1, E_2 \rfloor^+ = \lceil E_1^+, E_2^+ \rfloor$ for any block-diagonal matrix. Then define

$$A_i^+ = Q \begin{bmatrix} Y^{-1} \Sigma_i^+ \\ 0 \end{bmatrix} U_i^*.$$

One can verify that $A_i^+$ is a $\{1, 2, 3\}$-generalized inverse of $A_i$. Since it does not in general satisfy $(A_i^+ A_i)^* = A_i^+ A_i$, it is not in general the Moore–Penrose inverse; otherwise we could derive $(AB)^+ = B^+ A^+$ for any pair of matrices, which only holds for certain classes of matrices.
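The failure of the reverse-order law $(AB)^+ = B^+ A^+$ is easy to exhibit; the following minimal counterexample (our own illustration, using the Moore–Penrose pseudoinverse) shows the two sides disagree:

```python
import numpy as np

# Minimal counterexample to the reverse-order law (AB)^+ = B^+ A^+.
A = np.array([[1.0, 1.0]])        # 1 x 2
B = np.array([[1.0], [0.0]])      # 2 x 1

lhs = np.linalg.pinv(A @ B)                     # A @ B = [[1.]], so lhs = [[1.]]
rhs = np.linalg.pinv(B) @ np.linalg.pinv(A)     # [[1, 0]] @ [[0.5], [0.5]] = [[0.5]]
```

Here $A$ does not have full column rank and $B$ does not have full row rank; when both of those rank conditions do hold, the reverse-order law is known to be valid, which is one of the "certain classes" alluded to above.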
Suppose $Q = \begin{bmatrix} Q_1 & Q_2 \end{bmatrix}$, where $Q_1 \in \mathbb{F}^{n \times k}$ and $Q_2 \in \mathbb{F}^{n \times (n-k)}$. Then

$$\Sigma_1^+ = \lceil I_A, S_1^{-1}, 0_A^T \rfloor, \qquad \Sigma_2^+ = \lceil 0_B^T, S_2^{-1}, I_B \rfloor,$$

and

$$\Sigma_1 \Sigma_1^+ = \lceil I, I, 0 \rfloor, \quad \Sigma_2 \Sigma_2^+ = \lceil 0, I, I \rfloor, \quad \Sigma_1 \Sigma_2^+ = \lceil 0, S_1 S_2^{-1}, 0 \rfloor, \quad \Sigma_1^+ \Sigma_2 = \lceil 0, S_1^{-1} S_2, 0 \rfloor.$$

Therefore

$$A_i A_j^+ = U_i \Sigma_i \Sigma_j^+ U_j^*,$$
$$A_i^+ A_j = Q \begin{bmatrix} Y^{-1} \Sigma_i^+ \Sigma_j Y & 0 \\ 0 & 0 \end{bmatrix} Q^* = Q_1 Y^{-1} \Sigma_i^+ \Sigma_j Y Q_1^*.$$
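The block formulas for $\Sigma_1 \Sigma_1^+$ and $\Sigma_1 \Sigma_2^+$ can be confirmed on a toy example. The sizes below are hypothetical (our choice: $m_1 = 5$, $m_2 = k = 4$, $r = 1$, $s = 2$, one trailing zero $\alpha$), and the Moore–Penrose pseudoinverse coincides with the $\Sigma^+$ defined above for these diagonal factors:

```python
import numpy as np

# Toy diagonal factors: alpha_1 = 1 (identity block I_A = I_1),
# two interior values, and a trailing zero.
alpha = np.array([1.0, 0.9, 0.6, 0.0])     # alpha_1 >= ... >= alpha_k
beta = np.sqrt(1.0 - alpha**2)             # enforces Sigma1*Sigma1 + Sigma2*Sigma2 = I
Sigma1 = np.zeros((5, 4)); np.fill_diagonal(Sigma1, alpha)
Sigma2 = np.zeros((4, 4)); np.fill_diagonal(Sigma2, beta)

# Sigma1 Sigma1^+ keeps a 1 exactly where alpha_i > 0, and
# Sigma1 Sigma2^+ carries the ratios alpha_i / beta_i in the middle block only.
proj = np.diag(Sigma1 @ np.linalg.pinv(Sigma1))      # -> 1, 1, 1, 0, 0
ratios = np.diag(Sigma1 @ np.linalg.pinv(Sigma2))    # -> 0, 0.9/sqrt(0.19), 0.75, 0
```

Note how the leading entry of `ratios` is $\alpha_1 \beta_1^+ = 1 \cdot 0^+ = 0$, matching the convention $0^+ = 0^*$.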
A generalized singular ratio of $A_1$ and $A_2$ is $\sigma_i = \alpha_i \beta_i^+$. By the above properties,

$$A_1 A_2^+ = U_1 \Sigma_1 \Sigma_2^+ U_2^*,$$

where $\Sigma_1 \Sigma_2^+ = \lceil 0, S_1 S_2^{-1}, 0 \rfloor$ is the diagonal matrix of generalized singular ratios. If $A_2$ is square and invertible, then $A_2^+ = A_2^{-1}$, so $A_1 A_2^+ = A_1 A_2^{-1}$, and the factorization describes the quotient $A_1 A_2^{-1}$ without forming it explicitly; forming a product $A B^{-1}$ explicitly can be numerically unreliable when $B$ is ill-conditioned.

The factorization $U_1 \Sigma_1 \Sigma_2^+ U_2^*$ is almost an SVD of $A_1 A_2^+$: it fails to be one only because the diagonal of $\Sigma_1 \Sigma_2^+$ need not be sorted in decreasing order. Choosing permutation matrices $P_1$ and $P_2$ that sort this diagonal decreasingly, we obtain an SVD

$$U_1 \Sigma_1 \Sigma_2^+ U_2^* = (U_1 P_1)(P_1^* \Sigma_1 \Sigma_2^+ P_2)(P_2^* U_2^*),$$

whose nonzero singular values are the nonzero generalized singular ratios. In particular, $\mathrm{rank}(A_1 A_2^+) = s$.
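When $A_2$ is invertible, the generalized singular ratios can be checked against the singular values of the explicitly formed quotient. This sketch (illustrative random data; well-conditioned in the generic case, so forming the quotient is safe here) recovers the ratios from the eigenvalues of $(A_2^* A_2)^{-1}(A_1^* A_1)$:

```python
import numpy as np

# With A2 invertible, the generalized singular ratios equal the singular
# values of the explicit quotient A1 @ A2^{-1}.
rng = np.random.default_rng(3)
n = 4
A1 = rng.standard_normal((5, n))
A2 = rng.standard_normal((n, n))

# sigma_i^2 = alpha_i^2 / beta_i^2 are eigenvalues of (A2^T A2)^{-1} (A1^T A1).
lam = np.sort(np.linalg.eigvals(
    np.linalg.solve(A2.T @ A2, A1.T @ A1)).real)[::-1]
ratios = np.sqrt(lam)                       # sorted decreasingly
quotient_sv = np.linalg.svd(A1 @ np.linalg.inv(A2), compute_uv=False)
```

Both vectors are sorted in decreasing order, so they can be compared entrywise.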
To construct the decomposition, let the SVD of $C = \begin{bmatrix} A_1 \\ A_2 \end{bmatrix}$ be

$$C = P \lceil D, 0 \rfloor Q^*,$$

where $P \in \mathbb{F}^{(m_1 + m_2) \times (m_1 + m_2)}$ is unitary, and $Q$ and $D$ are as defined above. Partition $P = [P_1, P_2]$, where $P_1 \in \mathbb{F}^{(m_1 + m_2) \times k}$ holds the left singular vectors associated with $D$; the remaining block $P_2$ plays no further role. Further partition

$$P_1 = \begin{bmatrix} P_{11} \\ P_{21} \end{bmatrix},$$

where $P_{11} \in \mathbb{F}^{m_1 \times k}$ and $P_{21} \in \mathbb{F}^{m_2 \times k}$. Let $P_{11} = U_1 \Sigma_1 W^*$ be the SVD of $P_{11}$, with $U_1$, $\Sigma_1$ and $W$ as in the statement of the decomposition. Then

$$A_1 = P_{11} [D, 0_D] Q^* = U_1 \Sigma_1 [W^* D, 0_D] Q^*.$$

We also have

$$P_{11}^* P_{11} + P_{21}^* P_{21} = I_k,$$

since $P_1$ has orthonormal columns. Therefore

$$(P_{21} W)^* (P_{21} W) = I_k - \Sigma_1^* \Sigma_1 = \Sigma_2^* \Sigma_2,$$

so the columns of $P_{21} W$ are mutually orthogonal with norms $\beta_i$, and $P_{21} W = U_2 \Sigma_2$ for a suitable unitary $U_2$; hence

$$A_2 = P_{21} [D, 0_D] Q^* = U_2 \Sigma_2 [W^* D, 0_D] Q^*.$$

Since $P_1$ has orthonormal columns, $\|P_1\|_2 \le 1$, and so $\|P_{11} x\|_2 \le 1$ for any $x \in \mathbb{R}^k$ with $\|x\|_2 = 1$; likewise $\|P_{21}\|_2 \le 1$. This confirms that the diagonal entries of $\Sigma_1$ and $\Sigma_2$ lie in $[0, 1]$.
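The construction above translates directly into NumPy. This is an illustrative sketch only (it assumes real inputs and the generic case where every $\beta_i > 0$; the function name `gsvd` is ours), not a production routine; LAPACK's xGGSVD3 routines compute the GSVD directly and handle the degenerate cases:

```python
import numpy as np

def gsvd(A1, A2, tol=1e-10):
    """Sketch of the construction above (generic case: every beta_i > 0).
    Returns U1, U2, alpha, beta, X with A1 = U1 diag(alpha) X,
    A2 = U2 diag(beta) X and alpha_i^2 + beta_i^2 = 1."""
    m1 = A1.shape[0]
    # Step 1: SVD of the stacked matrix C = [A1; A2].
    P, d, Qh = np.linalg.svd(np.vstack([A1, A2]), full_matrices=False)
    k = int(np.sum(d > tol * d[0]))              # k = rank(C)
    P1, D, Q1h = P[:, :k], np.diag(d[:k]), Qh[:k, :]
    # Step 2: SVD of the top block P11 gives U1, the alphas, and W*.
    U1, alpha, Wh = np.linalg.svd(P1[:m1, :], full_matrices=False)
    # Orthonormal columns of P1 force alpha_i^2 + beta_i^2 = 1.
    beta = np.sqrt(np.clip(1.0 - alpha**2, 0.0, None))
    # Step 3: column-normalizing P21 @ W yields U2 (all beta_i > 0 assumed).
    U2 = (P1[m1:, :] @ Wh.conj().T) / beta
    return U1, U2, alpha, beta, Wh @ D @ Q1h     # X = W* D Q1*

# Example: a random pair with n = 3 columns.
rng = np.random.default_rng(0)
A1 = rng.standard_normal((5, 3))
A2 = rng.standard_normal((4, 3))
U1, U2, alpha, beta, X = gsvd(A1, A2)
```

The returned factorization is the $Y$-form of the GSVD with $X = W^* D Q_1^*$ absorbing the invertible middle factor.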
The GSVD, formulated as a comparative spectral decomposition,[4] has been successfully applied to signal processing and data science, e.g., in genomic signal processing.[5] [6] [7]
These applications inspired several additional comparative spectral decompositions, namely the higher-order GSVD (HO GSVD) and the tensor GSVD. The GSVD has also been applied to estimating the spectral decompositions of linear operators when the eigenfunctions are parameterized with a linear model, i.e., a reproducing kernel Hilbert space.[8]
The weighted version of the generalized singular value decomposition (GSVD) is a constrained matrix decomposition with constraints imposed on the left and right singular vectors of the singular value decomposition.[9][10][11] This form of the GSVD is an extension of the SVD as such. Given the SVD of an m×n real or complex matrix M,

$$M = U \Sigma V^*,$$

the weighted version imposes the constraints

$$U^* W_u U = V^* W_v V = I,$$

where $I$ is the identity matrix and where $U$ and $V$ are orthonormal with respect to the constraint matrices $W_u$ and $W_v$. These matrices are positive definite and are often taken to be diagonal matrices of weights. The weighted form of the GSVD is so called because, with the correct selection of weights, it generalizes many techniques, such as multidimensional scaling and linear discriminant analysis.[12]
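One standard way to compute such a weighted decomposition is to transform by the symmetric square roots of the weights and take an ordinary SVD. The sketch below assumes symmetric positive-definite $W_u$ and $W_v$; the helper name `weighted_svd` is ours, not from the cited references:

```python
import numpy as np

def weighted_svd(M, Wu, Wv):
    """Weighted SVD sketch: returns U, s, V with M = U @ diag(s) @ V^T and
    U^T Wu U = V^T Wv V = I, for symmetric positive-definite Wu, Wv."""
    def sqrt_and_inv_sqrt(W):
        # W = E diag(w) E^T with w > 0, so W^(1/2) = E diag(sqrt(w)) E^T.
        w, E = np.linalg.eigh(W)
        return (E * np.sqrt(w)) @ E.T, (E / np.sqrt(w)) @ E.T
    Wu_h, Wu_ih = sqrt_and_inv_sqrt(Wu)
    Wv_h, Wv_ih = sqrt_and_inv_sqrt(Wv)
    # Ordinary SVD of the transformed matrix, then undo the transformation.
    Ut, s, Vth = np.linalg.svd(Wu_h @ M @ Wv_h, full_matrices=False)
    return Wu_ih @ Ut, s, Wv_ih @ Vth.conj().T

# Example with diagonal weight matrices.
rng = np.random.default_rng(4)
M = rng.standard_normal((5, 3))
Wu = np.diag(rng.uniform(0.5, 2.0, size=5))
Wv = np.diag(rng.uniform(0.5, 2.0, size=3))
U, s, V = weighted_svd(M, Wu, Wv)
```

Setting $W_u = I$ and $W_v = I$ recovers the ordinary SVD, which is the sense in which this form extends the SVD.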