See also: Cross-correlation.
Given two stochastic processes \left\{X_t\right\} and \left\{Y_t\right\} with mean functions \mu_X(t)=\operatorname{E}[X_t] and \mu_Y(t)=\operatorname{E}[Y_t], the cross-covariance is given by

\operatorname{K}_{XY}(t_1,t_2)=\operatorname{cov}(X_{t_1},Y_{t_2})=\operatorname{E}[(X_{t_1}-\mu_X(t_1))(Y_{t_2}-\mu_Y(t_2))]=\operatorname{E}[X_{t_1}Y_{t_2}]-\mu_X(t_1)\mu_Y(t_2).
Cross-covariance is related to the more commonly used cross-correlation of the processes in question.
In the case of two random vectors X=(X_1,X_2,\ldots,X_p)^{\rm T} and Y=(Y_1,Y_2,\ldots,Y_q)^{\rm T}, the cross-covariance is the p \times q matrix \operatorname{K}_{XY} (often denoted \operatorname{cov}(X,Y)) whose (j,k)-th entry is

\operatorname{K}_{XY}(j,k)=\operatorname{cov}(X_j,Y_k).

The term cross-covariance distinguishes this concept from the covariance of a single random vector X, which is understood to be the matrix of covariances between the scalar components of X itself.
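As a concrete illustration of the p \times q cross-covariance matrix, the following NumPy sketch estimates \operatorname{K}_{XY} from samples; the data, sample size, and correlation structure are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# n samples of a p-dimensional X and a q-dimensional Y (hypothetical data;
# Y is built from X so the cross-covariance is nonzero)
n, p, q = 10_000, 3, 2
X = rng.normal(size=(n, p))
Y = X[:, :q] + 0.5 * rng.normal(size=(n, q))

# Sample cross-covariance matrix K_XY with entries cov(X_j, Y_k)
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
K_XY = Xc.T @ Yc / (n - 1)          # shape (p, q)

# np.cov on the stacked data contains the same block in its upper-right corner
full = np.cov(X.T, Y.T)             # shape (p+q, p+q)
print(np.allclose(K_XY, full[:p, p:]))
```

The off-diagonal block of the joint covariance matrix of the stacked vector (X, Y) is exactly \operatorname{K}_{XY}, which is what the comparison against np.cov checks.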
In signal processing, the cross-covariance is often called cross-correlation and is a measure of similarity of two signals, commonly used to find features in an unknown signal by comparing it to a known one. It is a function of the relative time between the signals, is sometimes called the sliding dot product, and has applications in pattern recognition and cryptanalysis.
See main article: Cross-covariance matrix.
The definition of cross-covariance of random vectors may be generalized to stochastic processes as follows:
Let \{X(t)\} and \{Y(t)\} denote stochastic processes. Then the cross-covariance function \operatorname{K}_{XY} is given by

\operatorname{K}_{XY}(t_1,t_2)=\operatorname{cov}(X_{t_1},Y_{t_2})=\operatorname{E}\left[(X(t_1)-\mu_X(t_1))(Y(t_2)-\mu_Y(t_2))\right]

where \mu_X(t)=\operatorname{E}\left[X(t)\right] and \mu_Y(t)=\operatorname{E}\left[Y(t)\right].
If the processes are complex-valued stochastic processes, the second factor needs to be complex conjugated:
\operatorname{K}_{XY}(t_1,t_2)\stackrel{\mathrm{def}}{=}\operatorname{E}\left[(X(t_1)-\mu_X(t_1))\overline{(Y(t_2)-\mu_Y(t_2))}\right]
If \left\{X_t\right\} and \left\{Y_t\right\} are jointly wide-sense stationary, then the mean functions are constant, \mu_X(t_1)=\mu_X(t_2)\triangleq\mu_X and \mu_Y(t_1)=\mu_Y(t_2)\triangleq\mu_Y for all t_1,t_2, and the cross-covariance depends only on the time difference:

\operatorname{K}_{XY}(t_1,t_2)=\operatorname{K}_{XY}(t_2-t_1,0) for all t_1,t_2.

By setting \tau=t_2-t_1 (the lag), we may define

\operatorname{K}_{XY}(\tau)=\operatorname{K}_{XY}(t_2-t_1)\triangleq\operatorname{K}_{XY}(t_1,t_2).
The cross-covariance function of two jointly WSS processes is therefore given by

\operatorname{K}_{XY}(\tau)=\operatorname{cov}(X_{t+\tau},Y_t)=\operatorname{E}[(X_{t+\tau}-\mu_X)(Y_t-\mu_Y)]=\operatorname{E}[X_{t+\tau}Y_t]-\mu_X\mu_Y,

which is equivalent to

\operatorname{K}_{XY}(\tau)=\operatorname{cov}(X_t,Y_{t-\tau})=\operatorname{E}[(X_t-\mu_X)(Y_{t-\tau}-\mu_Y)]=\operatorname{E}[X_tY_{t-\tau}]-\mu_X\mu_Y.
Two stochastic processes \left\{X_t\right\} and \left\{Y_t\right\} are called uncorrelated if their cross-covariance \operatorname{K}_{XY}(t_1,t_2) is zero for all times:

\left\{X_t\right\},\left\{Y_t\right\}\text{ uncorrelated}\iff\operatorname{K}_{XY}(t_1,t_2)=0\quad\forall t_1,t_2.
The cross-covariance is also relevant in signal processing where the cross-covariance between two wide-sense stationary random processes can be estimated by averaging the product of samples measured from one process and samples measured from the other (and its time shifts). The samples included in the average can be an arbitrary subset of all the samples in the signal (e.g., samples within a finite time window or a sub-sampling of one of the signals). For a large number of samples, the average converges to the true covariance.
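The averaging estimator described above can be sketched in NumPy. The two processes, the delay, and the noise level below are invented for the example; the estimate \operatorname{K}_{XY}(\tau) should peak at the lag where the two signals align:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two jointly WSS processes: y_t is (approximately) x_{t+delay} plus noise
N, delay = 100_000, 5
x = rng.normal(size=N)
y = np.roll(x, -delay) + 0.1 * rng.normal(size=N)

def cross_cov(x, y, tau):
    """Estimate K_XY(tau) = E[(x_{t+tau} - mu_x)(y_t - mu_y)] by sample averaging."""
    xc, yc = x - x.mean(), y - y.mean()
    if tau >= 0:
        return np.mean(xc[tau:] * yc[:len(x) - tau])
    return np.mean(xc[:len(x) + tau] * yc[-tau:])

taus = np.arange(-10, 11)
est = np.array([cross_cov(x, y, t) for t in taus])
tau_hat = taus[np.argmax(est)]   # peak lag; should be near tau = delay
```

Since x is (near) white noise, \operatorname{K}_{XY}(\tau) is close to zero except at \tau = delay, where it approaches the variance of x; with 10^5 samples the estimate recovers this peak reliably.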
Cross-covariance may also refer to a "deterministic" cross-covariance between two signals, obtained by summing over all time indices. For example, for discrete-time signals f[k] and g[k] the cross-covariance is defined as

(f\star g)[n]\stackrel{\mathrm{def}}{=}\sum_{k\in\mathbb{Z}}\overline{f[k]}\,g[n+k]
where the line indicates that the complex conjugate is taken when the signals are complex-valued.
Similarly, for continuous functions f(x) and g(x) the (deterministic) cross-covariance is defined as

(f\star g)(x)\stackrel{\mathrm{def}}{=}\int\overline{f(t)}\,g(x+t)\,dt
The (deterministic) cross-covariance of two continuous signals is related to the convolution by
(f\starg)(t)=(\overline{f(-\tau)}*g(\tau))(t)
and the (deterministic) cross-covariance of two discrete-time signals is related to the discrete convolution by
(f\starg)[n]=(\overline{f[-k]}*g[k])[n]
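The discrete relation above can be checked numerically. The sketch below uses NumPy on two arbitrary example sequences: np.correlate computes \sum_k \overline{f[k]}\,g[n+k] (it conjugates its second argument internally), and convolving g with the conjugated, time-reversed f should give the same sequence:

```python
import numpy as np

f = np.array([1.0, 2.0, 3.0])
g = np.array([0.0, 1.0, 0.5, 2.0])

# (f ⋆ g)[n] = sum_k conj(f[k]) g[n+k]; np.correlate conjugates
# its second argument, and mode='full' keeps every overlap
cross = np.correlate(g, f, mode="full")

# Relation to convolution: (f ⋆ g)[n] = (conj(f[-k]) * g[k])[n],
# i.e. convolve g with the conjugated, time-reversed f
conv = np.convolve(np.conj(f[::-1]), g)

print(np.allclose(cross, conv))
```

Both sequences have length len(f) + len(g) - 1, matching the support of the full cross-covariance.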