Complex random variable explained

In probability theory and statistics, complex random variables are a generalization of real-valued random variables to complex numbers, i.e. the possible values a complex random variable may take are complex numbers. Complex random variables can always be considered as pairs of real random variables: their real and imaginary parts. Therefore, the distribution of one complex random variable may be interpreted as the joint distribution of two real random variables.

Some concepts of real random variables have a straightforward generalization to complex random variables—e.g., the definition of the mean of a complex random variable. Other concepts are unique to complex random variables.

Applications of complex random variables are found in digital signal processing,[1] quadrature amplitude modulation and information theory.

Definition

A complex random variable Z on the probability space (\Omega, \mathcal{F}, P) is a function Z \colon \Omega \to \mathbb{C} such that both its real part \Re(Z) and its imaginary part \Im(Z) are real random variables on (\Omega, \mathcal{F}, P).

Examples

Simple example

Consider a random variable that may take only the three complex values 1+i, 1-i, 2 with probabilities as specified in the table below. This is a simple example of a complex random variable.

Value z    Probability P(z)
1+i        1/4
1-i        1/4
2          1/2

The expectation of this random variable may be simply calculated:

\operatorname{E}[Z] = \tfrac{1}{4}(1+i) + \tfrac{1}{4}(1-i) + \tfrac{1}{2} \cdot 2 = \tfrac{3}{2}.
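This calculation can be reproduced numerically. The following is a minimal Python sketch (not part of the original article) encoding the three-point distribution from the table as a dictionary from values to probabilities:

```python
# Discrete complex random variable from the example above:
# values 1+i, 1-i, 2 with probabilities 1/4, 1/4, 1/2.
dist = {1 + 1j: 0.25, 1 - 1j: 0.25, 2 + 0j: 0.5}

# E[Z] = sum over all values z of z * P(z)
expectation = sum(z * p for z, p in dist.items())
print(expectation)  # (1.5+0j)
```

Python's built-in `complex` type makes the sum a one-liner; the probabilities here are exact binary fractions, so the result is exactly 3/2.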

Uniform distribution

Another example of a complex random variable is the uniform distribution over the filled unit circle (the unit disk), i.e. the set \{z \in \mathbb{C} \mid |z| \le 1\}. This random variable is an example of a complex random variable for which the probability density function is defined: since the disk has area \pi, the density is constant, f_Z(z) = \tfrac{1}{\pi} for |z| \le 1 and zero otherwise.
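One way to make this distribution concrete is Monte Carlo sampling. The sketch below (an illustrative addition, not part of the original article) draws points uniformly from the unit disk by rejection sampling and checks that the sample mean of |Z|^2 is close to its exact value of 1/2 for this distribution:

```python
import random

random.seed(0)

def sample_unit_disk():
    """Rejection sampling: draw from the square [-1, 1]^2 until the point lies in the disk."""
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return complex(x, y)

samples = [sample_unit_disk() for _ in range(100_000)]

# For the uniform distribution on the unit disk, E[|Z|^2] = 1/2 exactly
# (|Z|^2 is uniformly distributed on [0, 1]).
mean_sq = sum(abs(z) ** 2 for z in samples) / len(samples)
print(mean_sq)
```

Rejection sampling accepts a fraction \pi/4 of the proposals, so the loop terminates quickly in practice.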

Complex normal distribution

See main article: Complex normal distribution.

Complex Gaussian random variables are often encountered in applications. They are a straightforward generalization of real Gaussian random variables.

Cumulative distribution function

The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form

P(Z \le 1+3i)

make no sense. However, expressions of the form

P(\Re(Z) \le 1, \Im(Z) \le 3)

do make sense. Therefore, the cumulative distribution function F_Z \colon \mathbb{C} \to [0,1] of a complex random variable is defined via the joint distribution of its real and imaginary parts:

F_Z(z) = P(\Re(Z) \le \Re(z), \Im(Z) \le \Im(z))

Probability density function

The probability density function of a complex random variable is defined as

f_Z(z) = f_{\Re(Z), \Im(Z)}(\Re(z), \Im(z)),

i.e. the value of the density function at a point z \in \mathbb{C} is defined to be equal to the value of the joint density of the real and imaginary parts of the random variable evaluated at the point (\Re(z), \Im(z)).

An equivalent definition is given by

f_Z(z) = \frac{\partial^2}{\partial x \, \partial y} P(\Re(Z) \le x, \Im(Z) \le y)

where x = \Re(z) and y = \Im(z).

As in the real case, the density function may not exist.

Expectation

The expectation of a complex random variable is defined in terms of the expectations of its real and imaginary parts:[2]

\operatorname{E}[Z] = \operatorname{E}[\Re(Z)] + i\,\operatorname{E}[\Im(Z)]

Note that the expectation of a complex random variable does not exist if \operatorname{E}[\Re(Z)] or \operatorname{E}[\Im(Z)] does not exist.

If the complex random variable Z has a probability density function f_Z(z), then the expectation is given by

\operatorname{E}[Z] = \iint_{\mathbb{C}} z \, f_Z(z) \, dx \, dy.

If the complex random variable Z has a probability mass function p_Z(z), then the expectation is given by

\operatorname{E}[Z] = \sum_z z \, p_Z(z).
Properties

Whenever the expectation of a complex random variable exists, taking the expectation and complex conjugation commute:

\overline{\operatorname{E}[Z]} = \operatorname{E}[\overline{Z}].

The expected value operator \operatorname{E}[\cdot] is linear in the sense that

\operatorname{E}[aZ + bW] = a\operatorname{E}[Z] + b\operatorname{E}[W]

for any complex coefficients a, b, even if Z and W are not independent.
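Linearity without independence can be verified exactly on a finite joint distribution. The sketch below (an illustrative addition; the joint pmf is invented for the demonstration) makes W fully dependent on Z by taking W to be its complex conjugate:

```python
# Joint pmf of a dependent pair (Z, W): here W = conj(Z), so Z and W
# are certainly not independent. Probabilities are illustrative.
joint = {(1 + 1j, 1 - 1j): 0.25, (1 - 1j, 1 + 1j): 0.25, (2 + 0j, 2 + 0j): 0.5}

def expect(f):
    """E[f(Z, W)] under the joint pmf."""
    return sum(f(z, w) * p for (z, w), p in joint.items())

a, b = 2 + 1j, -1j
lhs = expect(lambda z, w: a * z + b * w)
rhs = a * expect(lambda z, w: z) + b * expect(lambda z, w: w)
assert abs(lhs - rhs) < 1e-12  # linearity holds despite the dependence
```

Because linearity is an identity of sums, it holds term by term regardless of any correlation between Z and W.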

Variance and pseudo-variance

The variance is defined in terms of absolute squares as:[2]

\operatorname{Var}[Z] = \operatorname{E}\left[\left|Z - \operatorname{E}[Z]\right|^2\right] = \operatorname{E}\left[|Z|^2\right] - \left|\operatorname{E}[Z]\right|^2

Properties

The variance is always a nonnegative real number. It is equal to the sum of the variances of the real and imaginary parts of the complex random variable:

\operatorname{Var}[Z] = \operatorname{Var}[\Re(Z)] + \operatorname{Var}[\Im(Z)].

The variance of a linear combination of complex random variables may be calculated using the following formula:

\operatorname{Var}\left[\sum_{k=1}^N a_k Z_k\right] = \sum_{i=1}^N \sum_{j=1}^N a_i \overline{a_j} \operatorname{Cov}[Z_i, Z_j].
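Since this formula is an algebraic identity, it holds exactly for sample moments as well. The sketch below (an illustrative addition, not from the original article) checks it on raw data for N = 2 with complex coefficients:

```python
import random

random.seed(1)

# Two samples of complex data; the identity below holds exactly for
# empirical (sample) moments, so raw data suffices to verify it.
Z1 = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(1000)]
Z2 = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(1000)]

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    """Sample covariance E[(X - E[X]) * conj(Y - E[Y])]."""
    mx, my = mean(xs), mean(ys)
    return mean([(x - mx) * (y - my).conjugate() for x, y in zip(xs, ys)])

a1, a2 = 1 + 2j, 3 - 1j
combo = [a1 * x + a2 * y for x, y in zip(Z1, Z2)]

lhs = cov(combo, combo).real  # variance of the linear combination
rhs = sum(
    ai * aj.conjugate() * cov(Zi, Zj)
    for ai, Zi in [(a1, Z1), (a2, Z2)]
    for aj, Zj in [(a1, Z1), (a2, Z2)]
).real
assert abs(lhs - rhs) < 1e-9
```

Note the conjugate on the second factor of the sample covariance, mirroring the definition given later in this article.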

Pseudo-variance

The pseudo-variance is a special case of the pseudo-covariance. It is defined in terms of ordinary complex squares (without conjugation), given by:

\operatorname{J}_{ZZ} = \operatorname{E}\left[(Z - \operatorname{E}[Z])^2\right]

Unlike the variance of Z, which is always real and nonnegative, the pseudo-variance of Z is in general complex.

Covariance matrix of real and imaginary parts

For a general complex random variable, the pair (\Re(Z), \Im(Z)) has a covariance matrix of the form:

\begin{bmatrix} \operatorname{Var}[\Re(Z)] & \operatorname{Cov}[\Im(Z), \Re(Z)] \\ \operatorname{Cov}[\Re(Z), \Im(Z)] & \operatorname{Var}[\Im(Z)] \end{bmatrix}

The matrix is symmetric, so

\operatorname{Cov}[\Re(Z), \Im(Z)] = \operatorname{Cov}[\Im(Z), \Re(Z)]

Its elements equal:

\begin{align} \operatorname{Var}[\Re(Z)] &= \tfrac{1}{2}\operatorname{Re}(\operatorname{K}_{ZZ} + \operatorname{J}_{ZZ}) \\ \operatorname{Var}[\Im(Z)] &= \tfrac{1}{2}\operatorname{Re}(\operatorname{K}_{ZZ} - \operatorname{J}_{ZZ}) \\ \operatorname{Cov}[\Re(Z), \Im(Z)] &= \tfrac{1}{2}\operatorname{Im}(\operatorname{J}_{ZZ}) \end{align}

Conversely:

\begin{align} \operatorname{K}_{ZZ} &= \operatorname{Var}[\Re(Z)] + \operatorname{Var}[\Im(Z)] \\ \operatorname{J}_{ZZ} &= \operatorname{Var}[\Re(Z)] - \operatorname{Var}[\Im(Z)] + i\,2\operatorname{Cov}[\Re(Z), \Im(Z)] \end{align}
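The two descriptions of second-order structure carry the same information, which a short round-trip check makes explicit. This Python sketch (an illustrative addition; the numbers are made up) converts (Var[\Re(Z)], Var[\Im(Z)], Cov) to (K_ZZ, J_ZZ) and back:

```python
# Round-trip between the real covariance description and (K_ZZ, J_ZZ)
# using the identities above. The three input numbers are illustrative.
var_re, var_im, cov_re_im = 2.0, 0.5, 0.3

K = var_re + var_im                   # variance K_ZZ (always real)
J = var_re - var_im + 2j * cov_re_im  # pseudo-variance J_ZZ (complex in general)

# Recover the real description from (K, J); K is real, so Re(K + J) = K + J.real.
assert abs(0.5 * (K + J.real) - var_re) < 1e-12
assert abs(0.5 * (K - J.real) - var_im) < 1e-12
assert abs(0.5 * J.imag - cov_re_im) < 1e-12
```

Three real numbers in, one real and one complex number out: the dimension count (3 = 1 + 2) matches, as expected for an invertible reparameterization.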

Covariance and pseudo-covariance

The covariance between two complex random variables Z, W is defined as[2]

\operatorname{K}_{ZW} = \operatorname{Cov}[Z, W] = \operatorname{E}\left[(Z - \operatorname{E}[Z])\overline{(W - \operatorname{E}[W])}\right]

Notice the complex conjugation of the second factor in the definition.

In contrast to real random variables, we also define a pseudo-covariance (also called complementary covariance):

\operatorname{J}_{ZW} = \operatorname{Cov}[Z, \overline{W}] = \operatorname{E}\left[(Z - \operatorname{E}[Z])(W - \operatorname{E}[W])\right]

The second-order statistics are fully characterized by the covariance and the pseudo-covariance.

Properties

The covariance has the following properties:

\operatorname{Cov}[Z, W] = \overline{\operatorname{Cov}[W, Z]} (Conjugate symmetry)

\operatorname{Cov}[\alpha Z, W] = \alpha \operatorname{Cov}[Z, W] (Sesquilinearity)

\operatorname{Cov}[Z, \alpha W] = \overline{\alpha}\operatorname{Cov}[Z, W]

\operatorname{Cov}[Z_1 + Z_2, W] = \operatorname{Cov}[Z_1, W] + \operatorname{Cov}[Z_2, W]

\operatorname{Cov}[Z, W_1 + W_2] = \operatorname{Cov}[Z, W_1] + \operatorname{Cov}[Z, W_2]

\operatorname{Cov}[Z, Z] = \operatorname{Var}[Z]

Z and W are called uncorrelated if \operatorname{K}_{ZW} = \operatorname{J}_{ZW} = 0 (see also: uncorrelatedness (probability theory)).

Z and W are called orthogonal if \operatorname{E}[Z\overline{W}] = 0.

Circular symmetry

Circular symmetry of complex random variables is a common assumption used in the field of wireless communication. A typical example of a circularly symmetric complex random variable is the complex Gaussian random variable with zero mean and zero pseudo-covariance matrix.

A complex random variable Z is circularly symmetric if, for any deterministic \phi \in [-\pi, \pi], the distribution of e^{i\phi} Z equals the distribution of Z.
Properties

By definition, a circularly symmetric complex random variable has

\operatorname{E}[Z] = \operatorname{E}[e^{i\phi} Z] = e^{i\phi}\operatorname{E}[Z]

for any \phi. Thus the expectation of a circularly symmetric complex random variable can only be either zero or undefined.

Additionally,

\operatorname{E}[ZZ] = \operatorname{E}[e^{i\phi} Z \, e^{i\phi} Z] = e^{2i\phi}\operatorname{E}[ZZ]

for any \phi. Thus the pseudo-variance of a circularly symmetric complex random variable can only be zero.

If Z and e^{i\phi} Z have the same distribution, the phase of Z must be uniformly distributed over [-\pi, \pi] and independent of the amplitude of Z.[3]
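These properties can be observed empirically. The Monte Carlo sketch below (an illustrative addition, not part of the original article) simulates a zero-mean complex Gaussian with i.i.d. standard-normal real and imaginary parts, which is circularly symmetric, and checks that the sample mean and sample pseudo-variance are near zero while the variance is near 2:

```python
import random

random.seed(2)

# Zero-mean complex Gaussian with i.i.d. N(0, 1) real and imaginary parts:
# circularly symmetric, variance E[|Z|^2] = 2, pseudo-variance 0.
n = 200_000
Z = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]

m = sum(Z) / n
var = sum(abs(z - m) ** 2 for z in Z) / n       # E[|Z - E[Z]|^2], expect ~2
pseudo = sum((z - m) ** 2 for z in Z) / n       # E[(Z - E[Z])^2], expect ~0

assert abs(m) < 0.02        # expectation near zero
assert abs(var - 2) < 0.05  # variance near 2
assert abs(pseudo) < 0.05   # pseudo-variance near zero
```

The key contrast is between `abs(...) ** 2` (conjugate square, real) and `(...) ** 2` (ordinary complex square): the first gives the variance, the second the pseudo-variance.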

Proper complex random variables

The concept of proper random variables is unique to complex random variables and has no counterpart for real random variables.

A complex random variable Z is called proper if the following three conditions are all satisfied:

\operatorname{E}[Z] = 0

\operatorname{Var}[Z] < \infty

\operatorname{E}[Z^2] = 0

This definition is equivalent to the following conditions, i.e. a complex random variable is proper if, and only if:

\operatorname{E}[Z] = 0

\operatorname{E}[\Re(Z)^2] = \operatorname{E}[\Im(Z)^2] < \infty

\operatorname{E}[\Re(Z)\Im(Z)] = 0

For a proper complex random variable, the covariance matrix of the pair (\Re(Z), \Im(Z)) has the following simple form:

\begin{bmatrix} \tfrac{1}{2}\operatorname{Var}[Z] & 0 \\ 0 & \tfrac{1}{2}\operatorname{Var}[Z] \end{bmatrix}

I.e.:

\begin{align} \operatorname{Var}[\Re(Z)] &= \operatorname{Var}[\Im(Z)] = \tfrac{1}{2}\operatorname{Var}[Z] \\ \operatorname{Cov}[\Re(Z), \Im(Z)] &= 0 \end{align}
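The equivalence of the two characterizations can be checked exactly on a finite example. The sketch below (an illustrative addition; the four-point distribution is invented for the demonstration) uses the values 1, -1, i, -i with equal probability, which form a proper complex random variable:

```python
# A finite proper complex random variable: the four values 1, -1, i, -i
# with equal probability 1/4 (illustrative example).
dist = {1 + 0j: 0.25, -1 + 0j: 0.25, 1j: 0.25, -1j: 0.25}

def E(f):
    """Expectation of f(Z) under the pmf."""
    return sum(f(z) * p for z, p in dist.items())

# First characterization: zero mean and zero pseudo-variance (the variance
# is trivially finite for a finite distribution).
assert E(lambda z: z) == 0
assert E(lambda z: z * z) == 0

# Equivalent characterization via real and imaginary parts.
assert E(lambda z: z.real ** 2) == E(lambda z: z.imag ** 2)
assert E(lambda z: z.real * z.imag) == 0
```

Both sets of conditions hold simultaneously here, as the equivalence requires.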

Cauchy-Schwarz inequality

The Cauchy-Schwarz inequality for complex random variables, which can be derived using the triangle inequality and Hölder's inequality, is

\left|\operatorname{E}\left[Z\overline{W}\right]\right|^2 \le \left(\operatorname{E}\left[\left|Z\overline{W}\right|\right]\right)^2 \le \operatorname{E}\left[|Z|^2\right]\operatorname{E}\left[|W|^2\right].
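Both inequalities in the chain hold exactly for empirical expectations (the first by the triangle inequality, the second by Cauchy-Schwarz applied to the empirical distribution), so they can be spot-checked on raw data. A minimal sketch, with data invented for the demonstration:

```python
import random

random.seed(3)

# Random complex data; empirical expectations are plain averages.
n = 1000
Z = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
W = [complex(random.gauss(1, 2), random.gauss(-1, 1)) for _ in range(n)]

def E(xs):
    return sum(xs) / len(xs)

lhs = abs(E([z * w.conjugate() for z, w in zip(Z, W)])) ** 2
mid = E([abs(z * w.conjugate()) for z, w in zip(Z, W)]) ** 2
rhs = E([abs(z) ** 2 for z in Z]) * E([abs(w) ** 2 for w in W])

assert lhs <= mid <= rhs
```

Because both bounds are deterministic inequalities rather than asymptotic statements, the assertion holds for any sample, not just in the limit.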

Characteristic function

The characteristic function of a complex random variable is a function \varphi_Z \colon \mathbb{C} \to \mathbb{C} defined by

\varphi_Z(\omega) = \operatorname{E}\left[e^{i\Re(\overline{\omega}Z)}\right] = \operatorname{E}\left[e^{i(\Re(\omega)\Re(Z) + \Im(\omega)\Im(Z))}\right].


Notes and References

  1. Lapidoth, A. (2009). A Foundation in Digital Communication. Cambridge University Press. ISBN 9780521193955.
  2. Park, Kun Il (2018). Fundamentals of Probability and Stochastic Processes with Applications to Communications. Springer. ISBN 978-3-319-68074-3.
  3. Schreier, Peter J.; Scharf, Louis L. (2011). Statistical Signal Processing of Complex-Valued Data. Cambridge University Press. ISBN 9780511815911.