In probability theory and statistics, a complex random vector is typically a tuple of complex-valued random variables, and generally is a random variable taking values in a vector space over the field of complex numbers. If Z_1,\ldots,Z_n are complex-valued random variables, then the n-tuple \left(Z_1,\ldots,Z_n\right)^T is a complex random vector.
Some concepts of real random vectors generalize straightforwardly to complex random vectors; the mean, for example, is defined in the same way. Other concepts are unique to complex random vectors.
Applications of complex random vectors are found in digital signal processing.
A complex random vector Z=(Z_1,\ldots,Z_n)^T on the probability space (\Omega,\mathcal{F},P) is a function Z\colon\Omega\to\mathbb{C}^n such that the vector (\Re{(Z_1)},\Im{(Z_1)},\ldots,\Re{(Z_n)},\Im{(Z_n)})^T is a real random vector on (\Omega,\mathcal{F},P), where \Re{(z)} denotes the real part of z and \Im{(z)} denotes the imaginary part of z.
The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form P(Z\leq 1+3i) make no sense. However, the expression P(\Re{(Z)}\leq 1,\Im{(Z)}\leq 3) makes sense. Therefore, the cumulative distribution function F_Z\colon\mathbb{C}^n\to[0,1] of a complex random vector Z=(Z_1,\ldots,Z_n)^T is defined as

F_Z(z)=P(\Re{(Z_1)}\leq\Re{(z_1)},\Im{(Z_1)}\leq\Im{(z_1)},\ldots,\Re{(Z_n)}\leq\Re{(z_n)},\Im{(Z_n)}\leq\Im{(z_n)})

where z=(z_1,\ldots,z_n)^T.
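For a scalar complex random variable, this definition can be checked with a Monte Carlo estimate. The sketch below (the names and the standard-Gaussian choice are illustrative, not from the text) estimates F_Z(1+3i) for Z with independent standard normal real and imaginary parts:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Illustrative Z: independent N(0, 1) real and imaginary parts
Z = rng.standard_normal(N) + 1j * rng.standard_normal(N)

z = 1 + 3j
# F_Z(z) = P(Re(Z) <= Re(z), Im(Z) <= Im(z))
F = np.mean((Z.real <= z.real) & (Z.imag <= z.imag))
# By independence of the parts, F should be close to Phi(1) * Phi(3) ~ 0.840
```

Because the real and imaginary parts are independent here, the event factorizes into two one-dimensional Gaussian tail probabilities.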
As in the real case, the expectation (also called expected value) of a complex random vector is taken component-wise: \operatorname{E}[Z]=(\operatorname{E}[Z_1],\ldots,\operatorname{E}[Z_n])^T.
The covariance matrix (also called second central moment) \operatorname{K}_{ZZ} of an n\times 1 complex random vector is an n\times n matrix whose (i,j)-th element is the covariance between the i-th and the j-th random variables:

\operatorname{K}_{ZZ}=\operatorname{E}[(Z-\operatorname{E}[Z])(Z-\operatorname{E}[Z])^H]

In matrix form:

\operatorname{K}_{ZZ}= \begin{bmatrix} \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(Z_1-\operatorname{E}[Z_1])}] & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(Z_2-\operatorname{E}[Z_2])}] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(Z_n-\operatorname{E}[Z_n])}] \\ \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(Z_1-\operatorname{E}[Z_1])}] & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(Z_2-\operatorname{E}[Z_2])}] & \cdots & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(Z_n-\operatorname{E}[Z_n])}] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(Z_1-\operatorname{E}[Z_1])}] & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(Z_2-\operatorname{E}[Z_2])}] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(Z_n-\operatorname{E}[Z_n])}] \end{bmatrix}
The pseudo-covariance matrix (also called relation matrix) \operatorname{J}_{ZZ} is defined by replacing Hermitian transposition with transposition in the definition above:

\operatorname{J}_{ZZ}=\operatorname{E}[(Z-\operatorname{E}[Z])(Z-\operatorname{E}[Z])^T]

In matrix form:

\operatorname{J}_{ZZ}= \begin{bmatrix} \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(Z_1-\operatorname{E}[Z_1])] & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(Z_2-\operatorname{E}[Z_2])] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(Z_n-\operatorname{E}[Z_n])] \\ \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(Z_1-\operatorname{E}[Z_1])] & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(Z_2-\operatorname{E}[Z_2])] & \cdots & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(Z_n-\operatorname{E}[Z_n])] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(Z_1-\operatorname{E}[Z_1])] & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(Z_2-\operatorname{E}[Z_2])] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(Z_n-\operatorname{E}[Z_n])] \end{bmatrix}
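Both matrices can be estimated from samples by replacing expectations with sample means. A minimal NumPy sketch, assuming samples are stored as columns of an n × N array (names and the choice of distribution are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 3, 100_000

# Illustrative data: components with independent standard normal real and
# imaginary parts, so K_ZZ should be close to 2I and J_ZZ close to 0.
Z = rng.standard_normal((n, N)) + 1j * rng.standard_normal((n, N))

Zc = Z - Z.mean(axis=1, keepdims=True)   # centre each component
K = Zc @ Zc.conj().T / N                 # covariance:        E[(Z-E[Z])(Z-E[Z])^H]
J = Zc @ Zc.T / N                        # pseudo-covariance: E[(Z-E[Z])(Z-E[Z])^T]
```

The only difference between the two estimators is the complex conjugate, mirroring the Hermitian-transpose versus transpose distinction in the definitions.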
The covariance matrix is a Hermitian matrix, i.e.

\operatorname{K}_{ZZ}^H=\operatorname{K}_{ZZ}
The pseudo-covariance matrix is a symmetric matrix, i.e.

\operatorname{J}_{ZZ}^T=\operatorname{J}_{ZZ}
The covariance matrix is a positive semidefinite matrix, i.e.

a^H\operatorname{K}_{ZZ}a\ge 0 \quad \text{for all } a\in\mathbb{C}^n
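These three properties can be confirmed numerically on sample-based estimates. The sketch below (the setup is illustrative) draws a correlated complex vector and checks Hermitian symmetry of the covariance, symmetry of the pseudo-covariance, and positive semidefiniteness:

```python
import numpy as np

rng = np.random.default_rng(1)
n, N = 3, 10_000

# Illustrative correlated samples: mix a circular Gaussian with a fixed matrix A
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Z = A @ (rng.standard_normal((n, N)) + 1j * rng.standard_normal((n, N)))

Zc = Z - Z.mean(axis=1, keepdims=True)
K = Zc @ Zc.conj().T / N
J = Zc @ Zc.T / N

is_hermitian = np.allclose(K, K.conj().T)              # K_ZZ^H = K_ZZ
is_symmetric = np.allclose(J, J.T)                     # J_ZZ^T = J_ZZ
is_psd = bool(np.all(np.linalg.eigvalsh(K) >= -1e-9))  # a^H K_ZZ a >= 0
```

Note that the sample covariance is a Gram matrix (1/N) Z_c Z_c^H, so it is positive semidefinite exactly, not only in expectation.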
By decomposing the random vector Z into its real part X=\Re{(Z)} and imaginary part Y=\Im{(Z)} (i.e. Z=X+iY), the pair (X,Y) has a covariance matrix of the form:

\begin{bmatrix}\operatorname{K}_{XX}&\operatorname{K}_{XY}\\ \operatorname{K}_{YX}&\operatorname{K}_{YY}\end{bmatrix}
The matrices \operatorname{K}_{ZZ} and \operatorname{J}_{ZZ} can be related to the covariance matrices of X and Y via the following expressions:

\begin{align} &\operatorname{K}_{XX}=\operatorname{E}[(X-\operatorname{E}[X])(X-\operatorname{E}[X])^T]=\tfrac{1}{2}\operatorname{Re}(\operatorname{K}_{ZZ}+\operatorname{J}_{ZZ})\\ &\operatorname{K}_{YY}=\operatorname{E}[(Y-\operatorname{E}[Y])(Y-\operatorname{E}[Y])^T]=\tfrac{1}{2}\operatorname{Re}(\operatorname{K}_{ZZ}-\operatorname{J}_{ZZ})\\ &\operatorname{K}_{YX}=\operatorname{E}[(Y-\operatorname{E}[Y])(X-\operatorname{E}[X])^T]=\tfrac{1}{2}\operatorname{Im}(\operatorname{J}_{ZZ}+\operatorname{K}_{ZZ})\\ &\operatorname{K}_{XY}=\operatorname{E}[(X-\operatorname{E}[X])(Y-\operatorname{E}[Y])^T]=\tfrac{1}{2}\operatorname{Im}(\operatorname{J}_{ZZ}-\operatorname{K}_{ZZ}) \end{align}
Conversely:
\begin{align} &\operatorname{K}_{ZZ}=\operatorname{K}_{XX}+\operatorname{K}_{YY}+i(\operatorname{K}_{YX}-\operatorname{K}_{XY})\\ &\operatorname{J}_{ZZ}=\operatorname{K}_{XX}-\operatorname{K}_{YY}+i(\operatorname{K}_{YX}+\operatorname{K}_{XY}) \end{align}
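These relations hold exactly (not just in expectation) when every expectation is replaced by a sample mean over the same data, which makes them easy to sanity-check. A sketch with illustrative names and data:

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 2, 1_000

Z = rng.standard_normal((n, N)) + 1j * rng.standard_normal((n, N))
X, Y = Z.real, Z.imag

def cov(U, V):
    """Sample cross-covariance E[(U - E[U])(V - E[V])^T] for real arrays."""
    Uc = U - U.mean(axis=1, keepdims=True)
    Vc = V - V.mean(axis=1, keepdims=True)
    return Uc @ Vc.T / N

Zc = Z - Z.mean(axis=1, keepdims=True)
K = Zc @ Zc.conj().T / N
J = Zc @ Zc.T / N

Kxx, Kyy = cov(X, X), cov(Y, Y)
Kyx, Kxy = cov(Y, X), cov(X, Y)

# K_ZZ = K_XX + K_YY + i (K_YX - K_XY)
# J_ZZ = K_XX - K_YY + i (K_YX + K_XY)
K_rebuilt = Kxx + Kyy + 1j * (Kyx - Kxy)
J_rebuilt = Kxx - Kyy + 1j * (Kyx + Kxy)
```

Expanding (X_c + iY_c)(X_c ∓ iY_c)^T term by term recovers the same identities algebraically, which is why the agreement is exact up to floating-point error.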
The cross-covariance matrix between two complex random vectors Z,W is defined as:

\operatorname{K}_{ZW}=\operatorname{E}[(Z-\operatorname{E}[Z])(W-\operatorname{E}[W])^H]

In matrix form:

\operatorname{K}_{ZW}= \begin{bmatrix} \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(W_1-\operatorname{E}[W_1])}] & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(W_2-\operatorname{E}[W_2])}] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(W_n-\operatorname{E}[W_n])}] \\ \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(W_1-\operatorname{E}[W_1])}] & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(W_2-\operatorname{E}[W_2])}] & \cdots & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(W_n-\operatorname{E}[W_n])}] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(W_1-\operatorname{E}[W_1])}] & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(W_2-\operatorname{E}[W_2])}] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(W_n-\operatorname{E}[W_n])}] \end{bmatrix}
And the pseudo-cross-covariance matrix is defined as:

\operatorname{J}_{ZW}=\operatorname{E}[(Z-\operatorname{E}[Z])(W-\operatorname{E}[W])^T]

In matrix form:

\operatorname{J}_{ZW}= \begin{bmatrix} \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(W_1-\operatorname{E}[W_1])] & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(W_2-\operatorname{E}[W_2])] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(W_n-\operatorname{E}[W_n])] \\ \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(W_1-\operatorname{E}[W_1])] & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(W_2-\operatorname{E}[W_2])] & \cdots & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(W_n-\operatorname{E}[W_n])] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(W_1-\operatorname{E}[W_1])] & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(W_2-\operatorname{E}[W_2])] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(W_n-\operatorname{E}[W_n])] \end{bmatrix}
Two complex random vectors Z and W are called uncorrelated if

\operatorname{K}_{ZW}=\operatorname{J}_{ZW}=0
See main article: Independence (probability theory).

Two complex random vectors Z=(Z_1,\ldots,Z_m)^T and W=(W_1,\ldots,W_n)^T are called independent if

F_{Z,W}(z,w)=F_Z(z)\cdot F_W(w) \quad \text{for all } z,w

where F_Z(z) and F_W(w) denote the cumulative distribution functions of Z and W and F_{Z,W}(z,w) denotes their joint cumulative distribution function. Independence of Z and W is often denoted by Z\perp\perp W. Written component-wise, Z and W are called independent if

F_{Z_1,\ldots,Z_m,W_1,\ldots,W_n}(z_1,\ldots,z_m,w_1,\ldots,w_n)=F_{Z_1,\ldots,Z_m}(z_1,\ldots,z_m)\cdot F_{W_1,\ldots,W_n}(w_1,\ldots,w_n) \quad \text{for all } z_1,\ldots,z_m,w_1,\ldots,w_n
A complex random vector Z is called circularly symmetric if for every deterministic \varphi\in[-\pi,\pi) the distribution of e^{i\varphi}Z equals the distribution of Z.
A complex random vector Z is called proper if the following three conditions are all satisfied:

\operatorname{E}[Z]=0 (zero mean)

\operatorname{var}[Z_1]<\infty,\ldots,\operatorname{var}[Z_n]<\infty (all components have finite variance)

\operatorname{E}[ZZ^T]=0 (vanishing pseudo-covariance)
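A quick numeric illustration of the third condition (the distributions chosen are illustrative): a zero-mean vector with independent real and imaginary parts of equal variance has vanishing pseudo-covariance, while Z = X + iX, which ties the imaginary part to the real part, does not.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

def pseudo_cov(Z):
    """Sample estimate of E[(Z - E[Z])(Z - E[Z])^T]."""
    Zc = Z - Z.mean(axis=1, keepdims=True)
    return Zc @ Zc.T / Z.shape[1]

# Proper: independent real and imaginary parts with equal variance
Z_proper = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))

# Improper: imaginary part equals the real part
X = rng.standard_normal((2, N))
Z_improper = X + 1j * X

J_proper = pseudo_cov(Z_proper)      # close to the zero matrix
J_improper = pseudo_cov(Z_improper)  # far from zero (diagonal near 2i)
```

For the improper example, (1+i)^2 = 2i multiplies the covariance of X, so the pseudo-covariance picks up a purely imaginary diagonal instead of vanishing.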
Two complex random vectors Z,W are called jointly proper if the composite random vector (Z_1,Z_2,\ldots,Z_m,W_1,W_2,\ldots,W_n)^T is proper.
A complex random vector Z is proper if, and only if, for all (deterministic) vectors c\in\mathbb{C}^n the complex random variable c^T Z is proper.

Linear transformations of proper complex random vectors are proper, i.e. if Z is a proper random vector with n components and A is a deterministic m\times n matrix, then the complex random vector AZ is also proper.

Two jointly proper complex random vectors Z and W are uncorrelated if and only if \operatorname{K}_{ZW}=0, since joint properness already forces the pseudo-cross-covariance to vanish.
The Cauchy–Schwarz inequality for complex random vectors is

\left|\operatorname{E}[Z^H W]\right|^2\leq\operatorname{E}[Z^H Z]\operatorname{E}[W^H W]
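The inequality also holds exactly when the expectations are replaced by sample averages (it is then the Cauchy–Schwarz inequality for the empirical inner product), so it can be checked on arbitrary data. A sketch with illustrative random inputs:

```python
import numpy as np

rng = np.random.default_rng(4)
n, N = 3, 500

Z = rng.standard_normal((n, N)) + 1j * rng.standard_normal((n, N))
W = rng.standard_normal((n, N)) + 1j * rng.standard_normal((n, N))

# Sample versions of E[Z^H W], E[Z^H Z], E[W^H W]
E_ZhW = np.mean(np.sum(Z.conj() * W, axis=0))
E_ZhZ = np.mean(np.sum(np.abs(Z) ** 2, axis=0))
E_WhW = np.mean(np.sum(np.abs(W) ** 2, axis=0))

lhs = np.abs(E_ZhW) ** 2
rhs = E_ZhZ * E_WhW  # both factors are real and nonnegative
```

Since Z^H Z and W^H W are real and nonnegative, the right-hand side is a product of nonnegative reals, and the left-hand side can never exceed it.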
The characteristic function of a complex random vector Z with n components is a function \mathbb{C}^n\to\mathbb{C} defined by:

\varphi_Z(\omega)=\operatorname{E}\left[e^{i\Re{(\omega^H Z)}}\right]
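A Monte Carlo estimate of the characteristic function replaces the expectation by a sample mean. Because e^{iℜ(ω^H Z)} has unit modulus, any such estimate satisfies |φ_Z(ω)| ≤ 1, with φ_Z(0) = 1 exactly. A sketch (the setup and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n, N = 2, 50_000

Z = rng.standard_normal((n, N)) + 1j * rng.standard_normal((n, N))

def char_fn(Z, omega):
    """Monte Carlo estimate of phi_Z(omega) = E[exp(i Re(omega^H Z))]."""
    re_part = np.real(omega.conj() @ Z)   # Re(omega^H Z) for each sample column
    return np.mean(np.exp(1j * re_part))

omega = np.array([1.0 + 0.5j, -0.3j])
phi = char_fn(Z, omega)
phi0 = char_fn(Z, np.zeros(n, dtype=complex))  # equals 1 for any Z
```

Note that the argument of the exponential is the real part of the Hermitian inner product, not the product itself, so the exponent is always real.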