Complex random vector

In probability theory and statistics, a complex random vector is typically a tuple of complex-valued random variables, and generally is a random variable taking values in a vector space over the field of complex numbers. If

Z_1,\ldots,Z_n

are complex-valued random variables, then the n-tuple

\left(Z_1,\ldots,Z_n\right)

is a complex random vector. Complex random vectors can always be considered as pairs of real random vectors: their real and imaginary parts.

Some concepts of real random vectors have a straightforward generalization to complex random vectors, such as the definition of the mean of a complex random vector. Other concepts are unique to complex random vectors.

Applications of complex random vectors are found in digital signal processing.

Definition

A complex random vector

Z=(Z_1,\ldots,Z_n)^T
on the probability space

(\Omega,\mathcal{F},P)

is a function

Z\colon\Omega\to\mathbb{C}^n

such that the vector

(\Re{(Z_1)},\Im{(Z_1)},\ldots,\Re{(Z_n)},\Im{(Z_n)})^T
is a real random vector on

(\Omega,\mathcal{F},P)

where

\Re{(z)}

denotes the real part of

z

and

\Im{(z)}

denotes the imaginary part of

z

.[1]
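
As an illustration, here is a minimal numpy sketch (a hypothetical example, not taken from the cited references) that represents samples of a complex random vector through the associated real random vector of stacked real and imaginary parts:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical example: m samples (rows) of an n-dimensional complex random
    # vector, built from the real random vector
    # (Re(Z_1), Im(Z_1), ..., Re(Z_n), Im(Z_n)).
    m, n = 5, 3
    R = rng.standard_normal((m, 2 * n))   # samples of the real random vector
    Z = R[:, 0::2] + 1j * R[:, 1::2]      # interleave parts -> complex vector

    # Recover the underlying real random vector from Z.
    R_back = np.empty((m, 2 * n))
    R_back[:, 0::2], R_back[:, 1::2] = Z.real, Z.imag
    assert np.allclose(R, R_back)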

Cumulative distribution function

The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form

P(Z\leq1+3i)

make no sense. However, expressions of the form

P(\Re{(Z)}\leq1,\Im{(Z)}\leq3)

make sense. Therefore, the cumulative distribution function

F_Z\colon\mathbb{C}^n\to[0,1]

of a random vector

Z=(Z_1,\ldots,Z_n)^T

is defined as

F_Z(z)=\operatorname{P}(\Re{(Z_1)}\leq\Re{(z_1)},\Im{(Z_1)}\leq\Im{(z_1)},\ldots,\Re{(Z_n)}\leq\Re{(z_n)},\Im{(Z_n)}\leq\Im{(z_n)})

where

z=(z_1,\ldots,z_n)^T
.
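
As a hypothetical numerical illustration (the distribution is an assumption chosen for the example, not from the cited references), a value of the cumulative distribution function can be estimated from samples:

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical scalar example: Z has independent standard normal
    # real and imaginary parts.
    Z = rng.standard_normal(100_000) + 1j * rng.standard_normal(100_000)

    # Empirical estimate of F_Z(1 + 3i) = P(Re(Z) <= 1, Im(Z) <= 3).
    F_hat = np.mean((Z.real <= 1) & (Z.imag <= 3))
    # By independence, this should be close to Phi(1) * Phi(3) ~ 0.840.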

Expectation

As in the real case, the expectation (also called the expected value) of a complex random vector is taken component-wise.
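
Written out component-wise, this means

\operatorname{E}[Z]=(\operatorname{E}[Z_1],\ldots,\operatorname{E}[Z_n])^T

where each component is an ordinary expectation of a complex random variable.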

Covariance matrix and pseudo-covariance matrix

The covariance matrix (also called second central moment)

\operatorname{K}_{ZZ}

contains the covariances between all pairs of components. The covariance matrix of an

n\times1

random vector is an

n\times n

matrix whose

(i,j)

th element is the covariance between the i-th and the j-th random variables.[2] Unlike in the case of real random variables, the covariance between two random variables involves the complex conjugate of one of the two. Thus the covariance matrix is a Hermitian matrix.

\operatorname{K}_{ZZ}= \begin{bmatrix} \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(Z_1-\operatorname{E}[Z_1])}] & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(Z_2-\operatorname{E}[Z_2])}] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(Z_n-\operatorname{E}[Z_n])}] \\ \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(Z_1-\operatorname{E}[Z_1])}] & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(Z_2-\operatorname{E}[Z_2])}] & \cdots & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(Z_n-\operatorname{E}[Z_n])}] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(Z_1-\operatorname{E}[Z_1])}] & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(Z_2-\operatorname{E}[Z_2])}] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(Z_n-\operatorname{E}[Z_n])}] \end{bmatrix}

The pseudo-covariance matrix (also called the relation matrix) is defined by replacing the Hermitian (conjugate) transpose with the ordinary transpose in the definition above.

\operatorname{J}_{ZZ}= \begin{bmatrix} \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(Z_1-\operatorname{E}[Z_1])] & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(Z_2-\operatorname{E}[Z_2])] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(Z_n-\operatorname{E}[Z_n])] \\ \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(Z_1-\operatorname{E}[Z_1])] & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(Z_2-\operatorname{E}[Z_2])] & \cdots & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(Z_n-\operatorname{E}[Z_n])] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(Z_1-\operatorname{E}[Z_1])] & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(Z_2-\operatorname{E}[Z_2])] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(Z_n-\operatorname{E}[Z_n])] \end{bmatrix}
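
In compact notation, these definitions read \operatorname{K}_{ZZ}=\operatorname{E}\left[(Z-\operatorname{E}[Z])(Z-\operatorname{E}[Z])^H\right] and \operatorname{J}_{ZZ}=\operatorname{E}\left[(Z-\operatorname{E}[Z])(Z-\operatorname{E}[Z])^T\right]. As a hypothetical illustration (the distribution is an assumption chosen for the example), the following numpy sketch estimates both matrices from samples and checks the symmetry properties stated below:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical example: m samples (rows) of an n-dimensional complex
    # random vector with independent standard normal real/imaginary parts.
    m, n = 100_000, 3
    Z = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))

    Zc = Z - Z.mean(axis=0)        # center each component

    K = Zc.T @ Zc.conj() / m       # sample covariance matrix E[(Z-EZ)(Z-EZ)^H]
    J = Zc.T @ Zc / m              # sample pseudo-covariance E[(Z-EZ)(Z-EZ)^T]

    assert np.allclose(K, K.conj().T)   # K is Hermitian
    assert np.allclose(J, J.T)          # J is symmetric
    # For this input, K is close to 2*I and J is close to the zero matrix.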

Properties

The covariance matrix is a Hermitian matrix, i.e.[1]

\operatorname{K}_{ZZ}^H=\operatorname{K}_{ZZ}

.

The pseudo-covariance matrix is a symmetric matrix, i.e.

\operatorname{J}_{ZZ}^T=\operatorname{J}_{ZZ}

.

The covariance matrix is a positive semidefinite matrix, i.e.

a^H\operatorname{K}_{ZZ}a\ge0\quad\text{for all } a\in\mathbb{C}^n

.
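
The last property follows from the compact form of the covariance matrix: for every deterministic a\in\mathbb{C}^n,

a^H\operatorname{K}_{ZZ}a=\operatorname{E}\left[a^H(Z-\operatorname{E}[Z])(Z-\operatorname{E}[Z])^Ha\right]=\operatorname{E}\left[\left|(Z-\operatorname{E}[Z])^Ha\right|^2\right]\ge0.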

Covariance matrices of real and imaginary parts

By decomposing the random vector

Z

into its real part

X=\Re{(Z)}

and imaginary part

Y=\Im{(Z)}

(i.e.

Z=X+iY

), the pair

(X,Y)

has a covariance matrix of the form:

\begin{bmatrix}\operatorname{K}_{XX}&\operatorname{K}_{YX}\\ \operatorname{K}_{XY}&\operatorname{K}_{YY}\end{bmatrix}

The matrices

\operatorname{K}_{ZZ}

and

\operatorname{J}_{ZZ}

can be related to the covariance matrices of

X

and

Y

via the following expressions:

\begin{align} &\operatorname{K}_{XX}=\operatorname{E}[(X-\operatorname{E}[X])(X-\operatorname{E}[X])^T]=\tfrac{1}{2}\operatorname{Re}(\operatorname{K}_{ZZ}+\operatorname{J}_{ZZ})\\ &\operatorname{K}_{YY}=\operatorname{E}[(Y-\operatorname{E}[Y])(Y-\operatorname{E}[Y])^T]=\tfrac{1}{2}\operatorname{Re}(\operatorname{K}_{ZZ}-\operatorname{J}_{ZZ})\\ &\operatorname{K}_{YX}=\operatorname{E}[(Y-\operatorname{E}[Y])(X-\operatorname{E}[X])^T]=\tfrac{1}{2}\operatorname{Im}(\operatorname{J}_{ZZ}+\operatorname{K}_{ZZ})\\ &\operatorname{K}_{XY}=\operatorname{E}[(X-\operatorname{E}[X])(Y-\operatorname{E}[Y])^T]=\tfrac{1}{2}\operatorname{Im}(\operatorname{J}_{ZZ}-\operatorname{K}_{ZZ}) \end{align}

Conversely:

\begin{align} &\operatorname{K}_{ZZ}=\operatorname{K}_{XX}+\operatorname{K}_{YY}+i(\operatorname{K}_{YX}-\operatorname{K}_{XY})\\ &\operatorname{J}_{ZZ}=\operatorname{K}_{XX}-\operatorname{K}_{YY}+i(\operatorname{K}_{YX}+\operatorname{K}_{XY}) \end{align}
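
These identities can be checked numerically. The following numpy sketch (a hypothetical construction; the particular dependence between the real and imaginary parts is an assumption for the example) builds a vector with correlated parts and verifies all four relations on the sample moments:

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical example with dependent real and imaginary parts.
    m, n = 100_000, 2
    X = rng.standard_normal((m, n))
    Y = 0.5 * X + rng.standard_normal((m, n))
    Z = X + 1j * Y

    def cov(A, B):
        # Sample cross-covariance E[(A - E[A])(B - E[B])^T], samples as rows.
        Ac, Bc = A - A.mean(axis=0), B - B.mean(axis=0)
        return Ac.T @ Bc / len(A)

    Zc = Z - Z.mean(axis=0)
    K = Zc.T @ Zc.conj() / m    # covariance matrix
    J = Zc.T @ Zc / m           # pseudo-covariance matrix

    assert np.allclose(cov(X, X), 0.5 * (K + J).real)
    assert np.allclose(cov(Y, Y), 0.5 * (K - J).real)
    assert np.allclose(cov(Y, X), 0.5 * (J + K).imag)
    assert np.allclose(cov(X, Y), 0.5 * (J - K).imag)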

Cross-covariance matrix and pseudo-cross-covariance matrix

The cross-covariance matrix between two complex random vectors

Z,W

is defined as:

\operatorname{K}_{ZW}= \begin{bmatrix} \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(W_1-\operatorname{E}[W_1])}] & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(W_2-\operatorname{E}[W_2])}] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(W_n-\operatorname{E}[W_n])}] \\ \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(W_1-\operatorname{E}[W_1])}] & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(W_2-\operatorname{E}[W_2])}] & \cdots & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])\overline{(W_n-\operatorname{E}[W_n])}] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(W_1-\operatorname{E}[W_1])}] & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(W_2-\operatorname{E}[W_2])}] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(W_n-\operatorname{E}[W_n])}] \end{bmatrix}

And the pseudo-cross-covariance matrix is defined as:

\operatorname{J}_{ZW}= \begin{bmatrix} \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(W_1-\operatorname{E}[W_1])] & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(W_2-\operatorname{E}[W_2])] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(W_n-\operatorname{E}[W_n])] \\ \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(W_1-\operatorname{E}[W_1])] & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(W_2-\operatorname{E}[W_2])] & \cdots & \operatorname{E}[(Z_2-\operatorname{E}[Z_2])(W_n-\operatorname{E}[W_n])] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(W_1-\operatorname{E}[W_1])] & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(W_2-\operatorname{E}[W_2])] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(W_n-\operatorname{E}[W_n])] \end{bmatrix}
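
In compact notation, these two definitions read

\operatorname{K}_{ZW}=\operatorname{E}\left[(Z-\operatorname{E}[Z])(W-\operatorname{E}[W])^H\right], \qquad \operatorname{J}_{ZW}=\operatorname{E}\left[(Z-\operatorname{E}[Z])(W-\operatorname{E}[W])^T\right]

where ^H denotes the conjugate (Hermitian) transpose and ^T the ordinary transpose.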

Two complex random vectors

Z

and

W

are called uncorrelated if

\operatorname{K}_{ZW}=\operatorname{J}_{ZW}=0

.

Independence

See main article: Independence (probability theory). Two complex random vectors

Z=(Z_1,\ldots,Z_m)^T
and

W=(W_1,\ldots,W_n)^T
are called independent if

F_{Z,W}(z,w)=F_Z(z) \cdot F_W(w) \quad \text{for all } z,w

where

F_Z(z)

and

F_W(w)

denote the cumulative distribution functions of

Z

and

W

as defined above, and

F_{Z,W}(z,w)

denotes their joint cumulative distribution function. Independence of

Z

and

W

is often denoted by

Z\perp\perp W

. Written component-wise,

Z

and

W

are called independent if
F_{Z_1,\ldots,Z_m,W_1,\ldots,W_n}(z_1,\ldots,z_m,w_1,\ldots,w_n)=F_{Z_1,\ldots,Z_m}(z_1,\ldots,z_m) \cdot F_{W_1,\ldots,W_n}(w_1,\ldots,w_n)\quad\text{for all } z_1,\ldots,z_m,w_1,\ldots,w_n

.

Circular symmetry

A complex random vector

Z

is called circularly symmetric if for every deterministic

\varphi\in[-\pi,\pi)

the distribution of

e^{i\varphi}Z

equals the distribution of

Z

.[3]
Properties

The expectation of a circularly symmetric complex random vector is either zero or undefined, since \operatorname{E}[Z]=\operatorname{E}[e^{i\varphi}Z]=e^{i\varphi}\operatorname{E}[Z] must hold for every \varphi. Similarly, the pseudo-covariance matrix of a circularly symmetric complex random vector vanishes: \operatorname{J}_{ZZ}=e^{2i\varphi}\operatorname{J}_{ZZ} for all \varphi, hence \operatorname{J}_{ZZ}=0.

Proper complex random vectors

A complex random vector

Z

is called proper if the following three conditions are all satisfied:

\operatorname{E}[Z]=0

(zero mean)

\operatorname{var}[Z_1]<\infty,\ldots,\operatorname{var}[Z_n]<\infty

(all components have finite variance)

\operatorname{E}[ZZ^T]=0

Two complex random vectors

Z,W

are called jointly proper if the composite random vector

(Z_1,Z_2,\ldots,Z_m,W_1,W_2,\ldots,W_n)^T
is proper.
Properties

Z

is proper if, and only if, for all (deterministic) vectors

c\in\mathbb{C}^n

the complex random variable

c^TZ

is proper.

If

Z

is a proper random vector with

n

components and

A

is a deterministic

m\times n

matrix, then the complex random vector

AZ

is also proper.

Two jointly proper complex random vectors are uncorrelated if, and only if,

\operatorname{K}_{ZW}=0

.
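
The second property above is immediate from linearity of expectation: if \operatorname{E}[Z]=0 and \operatorname{E}[ZZ^T]=0, then

\operatorname{E}[AZ]=A\operatorname{E}[Z]=0, \qquad \operatorname{E}\left[(AZ)(AZ)^T\right]=A\operatorname{E}[ZZ^T]A^T=0.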

Cauchy-Schwarz inequality

The Cauchy-Schwarz inequality for complex random vectors is

\left|\operatorname{E}[Z^HW]\right|^2\leq\operatorname{E}[Z^HZ]\operatorname{E}[|W^HW|]

.
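
As a quick numerical sanity check (a hypothetical simulation with an arbitrarily chosen pair of dependent vectors), the inequality can be verified on sample moments:

    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical example: W is correlated with Z by construction.
    m, n = 100_000, 3
    Z = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
    W = Z + rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))

    zhw = np.sum(Z.conj() * W, axis=1)              # per-sample Z^H W
    lhs = np.abs(zhw.mean()) ** 2                   # |E[Z^H W]|^2
    rhs = (np.sum(np.abs(Z) ** 2, axis=1).mean()    # E[Z^H Z]
           * np.sum(np.abs(W) ** 2, axis=1).mean()) # E[|W^H W|]
    assert lhs <= rhs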

Characteristic function

The characteristic function of a complex random vector

Z

with

n

components is a function

\mathbb{C}^n\to\mathbb{C}

defined by:

\varphi_Z(\omega)=\operatorname{E}\left[e^{i\Re{(\omega^HZ)}}\right]=\operatorname{E}\left[e^{i(\Re{(\omega_1)}\Re{(Z_1)}+\Im{(\omega_1)}\Im{(Z_1)}+\cdots+\Re{(\omega_n)}\Re{(Z_n)}+\Im{(\omega_n)}\Im{(Z_n)})}\right]
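
As a hypothetical numerical illustration (the distribution and the chosen \omega are assumptions for the example), the characteristic function can be estimated by a sample mean:

    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical example: Z has independent standard normal real and
    # imaginary parts, so Re(w^H Z) ~ N(0, ||w||^2).
    m, n = 100_000, 2
    Z = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
    w = np.array([0.3 + 0.7j, -0.2 + 0.1j])

    phi_hat = np.mean(np.exp(1j * (Z @ w.conj()).real))
    # Exact value for this input: exp(-||w||^2 / 2) ~ 0.73.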

Notes and References

  1. Lapidoth, Amos (2009). A Foundation in Digital Communication. Cambridge University Press. ISBN 978-0-521-19395-5.
  2. Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.
  3. Tse, David; Viswanath, Pramod (2005). Fundamentals of Wireless Communication. Cambridge University Press.