Khatri–Rao product

The Khatri–Rao product of two partitioned matrices A and B is defined as[1] [2]

A \ast B = \left( A_{ij} \otimes B_{ij} \right)_{ij}

in which the ij-th block is the m_i p_i \times n_j q_j sized Kronecker product of the corresponding blocks of A and B, assuming the number of row and column partitions of both matrices is equal. The size of the product is then \left( \sum_i m_i p_i \right) \times \left( \sum_j n_j q_j \right).

For example, if A and B are both partitioned matrices:

A=\left[ \begin{array}{c|c} A_{11}&A_{12}\\ \hline A_{21}&A_{22}\end{array} \right] =\left[ \begin{array}{cc|c} 1&2&3\\ 4&5&6\\ \hline 7&8&9 \end{array} \right] ,\quad B=\left[ \begin{array}{c|c} B_{11}&B_{12}\\ \hline B_{21}&B_{22}\end{array} \right] =\left[ \begin{array}{c|cc} 1&4&7\\ \hline 2&5&8\\ 3&6&9 \end{array} \right] ,

we obtain:

A \ast B=\left[ \begin{array}{c|c} A_{11} \otimes B_{11}&A_{12} \otimes B_{12}\\ \hline A_{21} \otimes B_{21}&A_{22} \otimes B_{22}\end{array} \right] = \left[ \begin{array}{cc|cc} 1&2&12&21\\ 4&5&24&42\\ \hline 14&16&45&72\\ 21&24&54&81 \end{array} \right].

This is a submatrix of the Tracy–Singh product[3] of the two matrices (each partition in this example is a partition in a corner of the Tracy–Singh product).
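The block construction above can be sketched in NumPy; the partition boundaries below are chosen to match the example, and the variable names are illustrative:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
B = np.array([[1, 4, 7],
              [2, 5, 8],
              [3, 6, 9]])

# Partition A after row 2 / column 2, and B after row 1 / column 1,
# matching the example; each entry of *_blocks is one sub-matrix.
A_blocks = [np.hsplit(part, [2]) for part in np.vsplit(A, [2])]
B_blocks = [np.hsplit(part, [1]) for part in np.vsplit(B, [1])]

# The ij-th block of the Khatri-Rao product is kron(A_ij, B_ij).
khatri_rao = np.block([[np.kron(Aij, Bij) for Aij, Bij in zip(Arow, Brow)]
                       for Arow, Brow in zip(A_blocks, B_blocks)])
print(khatri_rao)
```

Note that the blocks of A and B need not have equal sizes, only equal numbers of row and column partitions.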

Column-wise Kronecker product

The column-wise Kronecker product of two matrices is a special case of the Khatri–Rao product as defined above, and may also be called the Khatri–Rao product. This product assumes the partitions of the matrices are their columns. In this case m_1 = m, p_1 = p, n = q, and for each j: n_j = q_j = 1. The resulting product is an mp \times n matrix of which each column is the Kronecker product of the corresponding columns of A and B. Using the matrices from the previous examples with the columns partitioned:

C=\left[ \begin{array}{c|c|c} C_1&C_2&C_3 \end{array} \right] =\left[ \begin{array}{c|c|c} 1&2&3\\ 4&5&6\\ 7&8&9 \end{array} \right] ,\quad D=\left[ \begin{array}{c|c|c} D_1&D_2&D_3 \end{array} \right] =\left[ \begin{array}{c|c|c} 1&4&7\\ 2&5&8\\ 3&6&9 \end{array} \right] ,

so that:

C \ast D =\left[ \begin{array}{c|c|c} C_1 \otimes D_1&C_2 \otimes D_2&C_3 \otimes D_3 \end{array} \right] = \left[ \begin{array}{c|c|c} 1&8&21\\ 2&10&24\\ 3&12&27\\ 4&20&42\\ 8&25&48\\ 12&30&54\\ 7&32&63\\ 14&40&72\\ 21&48&81 \end{array} \right].
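A minimal NumPy sketch of the column-wise product (SciPy also provides `scipy.linalg.khatri_rao` for this case; the function name here is an assumption):

```python
import numpy as np

def khatri_rao(C, D):
    """Column-wise Khatri-Rao product: column j of the result is kron(C[:, j], D[:, j])."""
    if C.shape[1] != D.shape[1]:
        raise ValueError("C and D must have the same number of columns")
    # out[i, j, k] = C[i, k] * D[j, k]; flattening (i, j) stacks each column's Kronecker product
    return np.einsum('ik,jk->ijk', C, D).reshape(-1, C.shape[1])

C = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
D = np.array([[1, 4, 7], [2, 5, 8], [3, 6, 9]])
print(khatri_rao(C, D))
```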

This column-wise version of the Khatri–Rao product is useful in linear algebra approaches to data analytical processing[4] and in optimizing the solution of inverse problems dealing with a diagonal matrix.[5] [6]

In 1996 the column-wise Khatri–Rao product was proposed to estimate the angles of arrival (AOAs) and delays of multipath signals[7] and four coordinates of signals sources[8] at a digital antenna array.

Face-splitting product

An alternative concept of the matrix product, which uses row-wise splitting of matrices with a given quantity of rows, was proposed by V. Slyusar[9] in 1996.[10] [11] [12] [13] This matrix operation was named the "face-splitting product" of matrices or the "transposed Khatri–Rao product". This type of operation is based on row-by-row Kronecker products of two matrices. Using the matrices from the previous examples with the rows partitioned:

C=\begin{bmatrix} C_1\\\hline C_2\\\hline C_3 \end{bmatrix} =\begin{bmatrix} 1&2&3\\\hline 4&5&6\\\hline 7&8&9 \end{bmatrix} ,\quad D=\begin{bmatrix} D_1\\\hline D_2\\\hline D_3 \end{bmatrix} =\begin{bmatrix} 1&4&7\\\hline 2&5&8\\\hline 3&6&9 \end{bmatrix} ,

we obtain:

C \bullet D = \begin{bmatrix} C_1 \otimes D_1\\\hline C_2 \otimes D_2\\\hline C_3 \otimes D_3 \end{bmatrix} = \begin{bmatrix} 1&4&7&2&8&14&3&12&21\\\hline 8&20&32&10&25&40&12&30&48\\\hline 21&42&63&24&48&72&27&54&81 \end{bmatrix}.
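Row-wise, the same einsum idea gives the face-splitting product; a NumPy sketch (function name assumed):

```python
import numpy as np

def face_splitting(C, D):
    """Face-splitting product: row i of the result is kron(C[i, :], D[i, :])."""
    if C.shape[0] != D.shape[0]:
        raise ValueError("C and D must have the same number of rows")
    # out[i, j, k] = C[i, j] * D[i, k]; flattening (j, k) gives the row-wise Kronecker product
    return np.einsum('ij,ik->ijk', C, D).reshape(C.shape[0], -1)

C = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
D = np.array([[1, 4, 7], [2, 5, 8], [3, 6, 9]])
print(face_splitting(C, D))
```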

Main properties

The face-splitting product satisfies the mixed-product identities

\left( A \bullet B \right)\left( C \otimes D \right) = \left( AC \right) \bullet \left( BD \right) \qquad \text{and} \qquad \left( A \bullet B \right)\left( x \ast y \right) = \left( Ax \right) \circ \left( By \right),

where \otimes denotes the Kronecker product, \ast the column-wise Khatri–Rao product, and \circ the Hadamard (element-wise) product.

Examples

\begin{align} &\left(\begin{bmatrix} 1&0\\ 0&1\\ 1&0 \end{bmatrix} \bullet \begin{bmatrix} 1&0\\ 1&0\\ 0&1 \end{bmatrix} \right) \left(\begin{bmatrix} 1&1\\ 1&-1 \end{bmatrix} \otimes \begin{bmatrix} 1&1\\ 1&-1 \end{bmatrix} \right) \left(\begin{bmatrix} \sigma_1&0\\ 0&\sigma_2 \end{bmatrix} \otimes \begin{bmatrix} \rho_1&0\\ 0&\rho_2 \end{bmatrix} \right) \left(\begin{bmatrix} x_1\\ x_2 \end{bmatrix} \ast \begin{bmatrix} y_1\\ y_2 \end{bmatrix} \right) \\[5pt] {}={}&\left(\begin{bmatrix} 1&0\\ 0&1\\ 1&0 \end{bmatrix} \bullet \begin{bmatrix} 1&0\\ 1&0\\ 0&1 \end{bmatrix} \right) \left(\begin{bmatrix} 1&1\\ 1&-1 \end{bmatrix} \begin{bmatrix} \sigma_1&0\\ 0&\sigma_2 \end{bmatrix} \begin{bmatrix} x_1\\ x_2 \end{bmatrix} \otimes \begin{bmatrix} 1&1\\ 1&-1 \end{bmatrix} \begin{bmatrix} \rho_1&0\\ 0&\rho_2 \end{bmatrix} \begin{bmatrix} y_1\\ y_2 \end{bmatrix} \right) \\[5pt] {}={}& \begin{bmatrix} 1&0\\ 0&1\\ 1&0 \end{bmatrix} \begin{bmatrix} 1&1\\ 1&-1 \end{bmatrix} \begin{bmatrix} \sigma_1&0\\ 0&\sigma_2 \end{bmatrix} \begin{bmatrix} x_1\\ x_2 \end{bmatrix} \circ \begin{bmatrix} 1&0\\ 1&0\\ 0&1 \end{bmatrix} \begin{bmatrix} 1&1\\ 1&-1 \end{bmatrix} \begin{bmatrix} \rho_1&0\\ 0&\rho_2 \end{bmatrix} \begin{bmatrix} y_1\\ y_2 \end{bmatrix} . \end{align}
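The chain of equalities above can be checked numerically; the concrete values chosen below for sigma, rho, x, and y are arbitrary assumptions:

```python
import numpy as np

def face_splitting(A, B):
    """Row-wise Kronecker product (face-splitting product)."""
    return np.einsum('ij,ik->ijk', A, B).reshape(A.shape[0], -1)

P = np.array([[1., 0.], [0., 1.], [1., 0.]])
Q = np.array([[1., 0.], [1., 0.], [0., 1.]])
H = np.array([[1., 1.], [1., -1.]])
S = np.diag([2.0, 3.0])            # sigma_1, sigma_2 (arbitrary values)
R = np.diag([5.0, 7.0])            # rho_1, rho_2 (arbitrary values)
x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

# For column vectors the Khatri-Rao product x * y is just kron(x, y).
lhs = face_splitting(P, Q) @ np.kron(H, H) @ np.kron(S, R) @ np.kron(x, y)
rhs = (P @ H @ S @ x) * (Q @ H @ R @ y)    # Hadamard product of the two chains
print(np.allclose(lhs, rhs))
```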

Theorem

If

M = T^{(1)} \bullet \dots \bullet T^{(c)},

where T^{(1)}, \dots, T^{(c)} are independent components of a random matrix T with independent identically distributed rows T_1, \dots, T_m \in \mathbb{R}^d, such that

\mathrm{E}\left[ (T_1 x)^2 \right] = \left\| x \right\|_2^2 \quad \text{and} \quad \mathrm{E}\left[ (T_1 x)^p \right]^{1/p} \le \sqrt{ap}\, \| x \|_2,

then for any vector x,

\left| \left\| Mx \right\|_2 - \left\| x \right\|_2 \right| < \varepsilon \left\| x \right\|_2

holds with probability 1-\delta if the quantity of rows

m = (4a)^{2c} \varepsilon^{-2} \log(1/\delta) + (2ae) \varepsilon^{-1} \left( \log(1/\delta) \right)^c.

In particular, if the entries of T are \pm 1, one can get

m = O\left( \varepsilon^{-2} \log(1/\delta) + \varepsilon^{-1} \left( \tfrac{1}{c} \log(1/\delta) \right)^c \right),

which matches the Johnson–Lindenstrauss lemma of

m = O\left( \varepsilon^{-2} \log(1/\delta) \right)

when \varepsilon is small.
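A rough numerical illustration of the \pm 1 case with c = 2; the 1/\sqrt{m} normalization is an assumption added here so that the embedding preserves norms on average:

```python
import numpy as np

def face_splitting(A, B):
    """Row-wise Kronecker product (face-splitting product)."""
    return np.einsum('ij,ik->ijk', A, B).reshape(A.shape[0], -1)

rng = np.random.default_rng(0)
m, d, c = 2048, 8, 2

# c independent matrices with iid +-1 rows, so E[(T_1 x)^2] = ||x||^2.
Ts = [rng.choice([-1.0, 1.0], size=(m, d)) for _ in range(c)]

M = Ts[0]
for T in Ts[1:]:
    M = face_splitting(M, T)
M /= np.sqrt(m)                    # normalization (assumed) so norms are preserved on average

x = rng.standard_normal(d ** c)    # a random test vector in R^(d^c)
ratio = np.linalg.norm(M @ x) / np.linalg.norm(x)
print(ratio)                       # typically close to 1
```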

Block face-splitting product

According to the definition of V. Slyusar, the block face-splitting product of two partitioned matrices with a given quantity of rows in blocks

A=\left[ \begin{array}{c|c} A_{11}&A_{12}\\ \hline A_{21}&A_{22}\end{array} \right] ,\quad B=\left[ \begin{array}{c|c} B_{11}&B_{12}\\ \hline B_{21}&B_{22}\end{array} \right] ,

can be written as:

A [\bullet] B=\left[ \begin{array}{c|c} A_{11} \bullet B_{11}&A_{12} \bullet B_{12}\\ \hline A_{21} \bullet B_{21}&A_{22} \bullet B_{22}\end{array} \right].

The transposed block face-splitting product (or block column-wise version of the Khatri–Rao product) of two partitioned matrices with a given quantity of columns in blocks is:

A [\ast] B=\left[ \begin{array}{c|c} A_{11} \ast B_{11}&A_{12} \ast B_{12}\\ \hline A_{21} \ast B_{21}&A_{22} \ast B_{22}\end{array} \right].
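A sketch of the block face-splitting product for 2 × 2 block matrices; the small blocks below are illustrative assumptions (within each block row, corresponding blocks of A and B must share a row count):

```python
import numpy as np

def face_splitting(A, B):
    """Row-wise Kronecker product (face-splitting product)."""
    return np.einsum('ij,ik->ijk', A, B).reshape(A.shape[0], -1)

def block_face_splitting(A_blocks, B_blocks):
    """ij-th block of the result is the face-splitting product of A_ij and B_ij."""
    return np.block([[face_splitting(Aij, Bij) for Aij, Bij in zip(Arow, Brow)]
                     for Arow, Brow in zip(A_blocks, B_blocks)])

A_blocks = [[np.array([[1, 2]]), np.array([[3]])],
            [np.array([[4, 5]]), np.array([[6]])]]
B_blocks = [[np.array([[7]]),    np.array([[8, 9]])],
            [np.array([[1]]),    np.array([[2, 3]])]]
print(block_face_splitting(A_blocks, B_blocks))
```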

Main properties

  1. Transpose:

\left( A [\ast] B \right)^{\textsf{T}} = A^{\textsf{T}} [\bullet] B^{\textsf{T}}

[14]

Applications

The face-splitting product and the block face-splitting product are used in the tensor-matrix theory of digital antenna arrays. These operations are also used in other applications, such as higher-order co-occurrence tensors for hypergraphs[15] and covariance structures for genotype × environment interaction models.[16]

References

  1. Khatri. C. G.. Rao. C. R.. 1968. Solutions to some functional equations and their applications to characterization of probability distributions. Sankhyā: The Indian Journal of Statistics, Series A. 30. 167–180. 2008-08-21. https://web.archive.org/web/20101023190620/http://sankhya.isical.ac.in/search/30a2/30a2019.html. 2010-10-23. dead.
  2. Liu . Shuangzhe . 1999 . Matrix Results on the Khatri–Rao and Tracy–Singh Products . Linear Algebra and Its Applications . 289 . 1–3 . 267–277 . 10.1016/S0024-3795(98)10209-4 . free .
  3. Liu. Shuangzhe. Trenkler. Götz . 2008 . Hadamard, Khatri-Rao, Kronecker and other matrix products . International Journal of Information and Systems Sciences. 4. 1. 160–177.
  4. See e.g. H. D. Macedo and J.N. Oliveira. A linear algebra approach to OLAP. Formal Aspects of Computing, 27(2):283–307, 2015.
  5. Lev-Ari. Hanoch. 2005-01-01. Efficient Solution of Linear Matrix Equations with Application to Multistatic Antenna Array Processing. Communications in Information & Systems. EN. 05. 1. 123–130. 1526-7555. 10.4310/CIS.2005.v5.n1.a5. free.
  6. Masiero. B.. Nascimento. V. H.. 2017-05-01. Revisiting the Kronecker Array Transform. IEEE Signal Processing Letters. 24. 5. 525–529. 10.1109/LSP.2017.2674969. 1070-9908. 2017ISPL...24..525M. 14166014.
  7. Vanderveen, M. C., Ng, B. C., Papadias, C. B., & Paulraj, A. (n.d.). Joint angle and delay estimation (JADE) for signals in multipath environments. Conference Record of The Thirtieth Asilomar Conference on Signals, Systems and Computers. – DOI:10.1109/acssc.1996.599145
  8. Slyusar. V. I.. December 27, 1996. End matrix products in radar applications. . Izvestiya VUZ: Radioelektronika . 41 . 3. 71–75.
  9. Anna Esteve, Eva Boj & Josep Fortiana (2009): "Interaction Terms in Distance-Based Regression," Communications in Statistics – Theory and Methods, 38:19, p. 3501 http://dx.doi.org/10.1080/03610920802592860
  10. Slyusar. V. I.. 1997-05-20. Analytical model of the digital antenna array on a basis of face-splitting matrix products. . Proc. ICATT-97, Kyiv. 108–109.
  11. Slyusar. V. I.. 1997-09-15. New operations of matrices product for applications of radars. Proc. Direct and Inverse Problems of Electromagnetic and Acoustic Wave Theory (DIPED-97), Lviv.. 73–74.
  12. Slyusar. V. I.. March 13, 1998. A Family of Face Products of Matrices and its Properties. Cybernetics and Systems Analysis C/C of Kibernetika I Sistemnyi Analiz. 1999.. 35. 3. 379–384. 10.1007/BF02733426. 119661450.
  13. Slyusar. V. I.. 2003. Generalized face-products of matrices in models of digital antenna arrays with nonidentical channels. Radioelectronics and Communications Systems. 46. 10. 9–17.
  14. Vadym Slyusar. New Matrix Operations for DSP (Lecture). April 1999. – DOI: 10.13140/RG.2.2.31620.76164/1
  15. Bryan Bischof. Higher order co-occurrence tensors for hypergraphs via face-splitting. Published 15 February 2020, Mathematics, Computer Science, ArXiv
  16. Johannes W. R. Martini, Jose Crossa, Fernando H. Toledo, Jaime Cuevas. On Hadamard and Kronecker products in covariance structures for genotype x environment interaction.//Plant Genome. 2020;13:e20033. Page 5. https://doi.org/10.1002/tpg2.20033