Bussgang theorem explained
In mathematics, the Bussgang theorem is a theorem of stochastic analysis. The theorem states that the cross-correlation between a Gaussian signal before and after it has passed through a nonlinear operation is equal to the signal's autocorrelation up to a constant. It was first published by Julian J. Bussgang in 1952 while he was at the Massachusetts Institute of Technology.[1]
Statement
Let $\left\{X(t)\right\}$ be a zero-mean stationary Gaussian random process and $\left\{Y(t)\right\} = g(X(t))$, where $g(\cdot)$ is a nonlinear amplitude distortion.

If $R_X(\tau)$ is the autocorrelation function of $\left\{X(t)\right\}$, then the cross-correlation function of $\left\{X(t)\right\}$ and $\left\{Y(t)\right\}$ is

$R_{XY}(\tau) = C R_X(\tau),$

where $C$ is a constant that depends only on $g(\cdot)$.

It can be further shown that

$C = \frac{1}{\sigma^3\sqrt{2\pi}} \int_{-\infty}^{\infty} u\, g(u)\, e^{-\frac{u^2}{2\sigma^2}} \, du.$
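The theorem lends itself to a quick numerical check. The sketch below is illustrative only: it assumes a cubic distortion $g(u) = u^3$ and unit variance ($\sigma = 1$), for which the integral gives $C = E[U^4] = 3$; the sample size, seed, and correlation value are arbitrary choices.

```python
import math

import numpy as np

# Monte Carlo sketch of Bussgang's theorem (illustrative assumptions:
# cubic distortion g(u) = u**3, unit-variance Gaussians, arbitrary seed).
# For sigma = 1 the constant is C = E[U * g(U)] = E[U**4] = 3.
rng = np.random.default_rng(0)

def g(u):
    return u ** 3  # example memoryless nonlinear distortion

rho = 0.6                     # correlation E[X1 X2] of the two samples
n = 2_000_000
x1 = rng.standard_normal(n)
x2 = rho * x1 + math.sqrt(1 - rho ** 2) * rng.standard_normal(n)

cross = np.mean(x1 * g(x2))   # cross-correlation E[X1 g(X2)]
C = np.mean(x1 * g(x1))       # Monte Carlo estimate of E[U g(U)] = 3
print(cross, C * rho)         # Bussgang: the two should agree closely
```

With two million samples the two printed values typically match to two decimal places.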
Derivation for One-bit Quantization
It is a property of the two-dimensional normal distribution that the joint density of $y_1$ and $y_2$ depends only on their covariance and is given explicitly by the expression

$p(y_1, y_2) = \frac{1}{2\pi\sqrt{1-\rho^2}}\, e^{-\frac{y_1^2 + y_2^2 - 2\rho y_1 y_2}{2(1-\rho^2)}},$

where $y_1$ and $y_2$ are standard Gaussian random variables with correlation $\rho$.

Assume that $r_2 = Q(y_2)$; the correlation between $y_1$ and $r_2$ is

$E(y_1 r_2) = \frac{1}{2\pi\sqrt{1-\rho^2}} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y_1 Q(y_2)\, e^{-\frac{y_1^2 + y_2^2 - 2\rho y_1 y_2}{2(1-\rho^2)}} \, dy_1 dy_2.$
Completing the square in the exponent gives $\frac{y_1^2 + y_2^2 - 2\rho y_1 y_2}{2(1-\rho^2)} = \frac{(y_1 - \rho y_2)^2}{2(1-\rho^2)} + \frac{y_2^2}{2}$. Since

$\int_{-\infty}^{\infty} y_1 e^{-\frac{(y_1 - \rho y_2)^2}{2(1-\rho^2)}} \, dy_1 = \rho\sqrt{2\pi(1-\rho^2)}\, y_2,$

the correlation may be simplified as

$E(y_1 r_2) = \frac{\rho}{\sqrt{2\pi}} \int_{-\infty}^{\infty} y_2 Q(y_2)\, e^{-\frac{y_2^2}{2}} \, dy_2.$

The integral above is seen to depend only on the distortion characteristic $Q(\cdot)$ and is independent of $\rho$.
Remembering that $\rho = E(y_1 y_2)$, we observe that for a given distortion characteristic $Q(\cdot)$, the ratio $K_Q = E(y_1 r_2)/E(y_1 y_2)$ is

$K_Q = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} y_2 Q(y_2)\, e^{-\frac{y_2^2}{2}} \, dy_2.$

Therefore, the correlation can be rewritten in the form

$E(y_1 r_2) = K_Q\, E(y_1 y_2).$
The above equation is the mathematical expression of the Bussgang theorem stated above.
If $Q(y) = \operatorname{sign}(y)$, i.e. one-bit quantization, then

$K_Q = \frac{2}{\sqrt{2\pi}} \int_{0}^{\infty} y_2\, e^{-\frac{y_2^2}{2}} \, dy_2 = \sqrt{\frac{2}{\pi}}.$
[2] [3] [4]
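This value is easy to confirm by simulation. The sketch below (sample size, seed, and correlation are arbitrary assumptions) checks that $E[y_1 \operatorname{sign}(y_2)] = \sqrt{2/\pi}\,\rho$.

```python
import math

import numpy as np

# Check that for one-bit quantization Q(y) = sign(y) the Bussgang gain
# is K_Q = sqrt(2/pi), i.e. E[y1 * sign(y2)] = sqrt(2/pi) * rho.
rng = np.random.default_rng(1)
rho = 0.4
n = 2_000_000
y1 = rng.standard_normal(n)
y2 = rho * y1 + math.sqrt(1 - rho ** 2) * rng.standard_normal(n)

lhs = np.mean(y1 * np.sign(y2))     # E[y1 Q(y2)]
rhs = math.sqrt(2 / math.pi) * rho  # K_Q * E[y1 y2]
print(lhs, rhs)                     # the two should agree closely
```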
Arcsine law
If the two random variables are both distorted, i.e., $r_1 = Q(y_1)$ and $r_2 = Q(y_2)$, the correlation of $r_1$ and $r_2$ is

$E(r_1 r_2) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} Q(y_1) Q(y_2)\, p(y_1, y_2) \, dy_1 dy_2.$

When $Q(y) = \operatorname{sign}(y)$, the expression becomes

$E(r_1 r_2) = \frac{1}{2\pi\sqrt{1-\rho^2}} \left[\int_{0}^{\infty} \int_{0}^{\infty} e^{-\alpha} \, dy_1 dy_2 + \int_{-\infty}^{0} \int_{-\infty}^{0} e^{-\alpha} \, dy_1 dy_2 - \int_{0}^{\infty} \int_{-\infty}^{0} e^{-\alpha} \, dy_1 dy_2 - \int_{-\infty}^{0} \int_{0}^{\infty} e^{-\alpha} \, dy_1 dy_2 \right],$

where $\alpha = \frac{y_1^2 + y_2^2 - 2\rho y_1 y_2}{2(1-\rho^2)}$.
Noticing that

$\frac{1}{2\pi\sqrt{1-\rho^2}} \left[\int_{0}^{\infty} \int_{0}^{\infty} e^{-\alpha} \, dy_1 dy_2 + \int_{-\infty}^{0} \int_{-\infty}^{0} e^{-\alpha} \, dy_1 dy_2 + \int_{0}^{\infty} \int_{-\infty}^{0} e^{-\alpha} \, dy_1 dy_2 + \int_{-\infty}^{0} \int_{0}^{\infty} e^{-\alpha} \, dy_1 dy_2 \right] = 1,$

and

$\int_{0}^{\infty} \int_{0}^{\infty} e^{-\alpha} \, dy_1 dy_2 = \int_{-\infty}^{0} \int_{-\infty}^{0} e^{-\alpha} \, dy_1 dy_2, \qquad \int_{0}^{\infty} \int_{-\infty}^{0} e^{-\alpha} \, dy_1 dy_2 = \int_{-\infty}^{0} \int_{0}^{\infty} e^{-\alpha} \, dy_1 dy_2,$

we can simplify the expression of $E(r_1 r_2)$ as

$E(r_1 r_2) = \frac{4}{2\pi\sqrt{1-\rho^2}} \int_{0}^{\infty} \int_{0}^{\infty} e^{-\alpha} \, dy_1 dy_2 - 1.$
Also, it is convenient to introduce the polar coordinates $y_1 = r\cos\theta$, $y_2 = r\sin\theta$. It is thus found that

$E(r_1 r_2) = \frac{4}{2\pi\sqrt{1-\rho^2}} \int_{0}^{\pi/2} \int_{0}^{\infty} e^{-\frac{r^2(1 - \rho\sin 2\theta)}{2(1-\rho^2)}}\, r \, dr \, d\theta - 1.$

Integration gives

$E(r_1 r_2) = \frac{2\sqrt{1-\rho^2}}{\pi} \int_{0}^{\pi/2} \frac{d\theta}{1 - \rho\sin 2\theta} - 1 = -\frac{2}{\pi} \arctan\left(\frac{\rho - \tan\theta}{\sqrt{1-\rho^2}}\right) \Bigg|_{0}^{\pi/2} - 1 = \frac{2}{\pi} \arcsin(\rho).$
This is called the "arcsine law", which was first found by J. H. Van Vleck in 1943 and republished in 1966. The arcsine law can also be proved in a simpler way by applying Price's theorem.[5]

The function $\frac{2}{\pi}\arcsin(\rho)$ can be approximated as $\frac{2}{\pi}\rho$ when $\rho$ is small.
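The arcsine law is also easy to verify by simulation. The sketch below (arbitrary sample size, seed, and test correlations) compares the empirical sign correlation with $\frac{2}{\pi}\arcsin(\rho)$.

```python
import math

import numpy as np

# Numerical check of the arcsine law: E[sign(y1) sign(y2)] = (2/pi) arcsin(rho)
# for zero-mean unit-variance jointly Gaussian y1, y2 with correlation rho.
rng = np.random.default_rng(2)
n = 2_000_000
estimates, exact_values = [], []
for rho in (0.1, 0.5, 0.9):
    y1 = rng.standard_normal(n)
    y2 = rho * y1 + math.sqrt(1 - rho ** 2) * rng.standard_normal(n)
    estimates.append(np.mean(np.sign(y1) * np.sign(y2)))
    exact_values.append(2 / math.pi * math.asin(rho))
print(list(zip(estimates, exact_values)))
```

Note also that for $\rho = 0.1$ the empirical value is close to the small-$\rho$ approximation $\frac{2}{\pi}\rho \approx 0.064$.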
Price's Theorem
Given two jointly normal random variables $y_1$ and $y_2$ with joint probability density function

$p(y_1, y_2) = \frac{1}{2\pi\sqrt{1-\rho^2}}\, e^{-\frac{y_1^2 + y_2^2 - 2\rho y_1 y_2}{2(1-\rho^2)}},$

we form the mean

$I(\rho) = E\left(g(y_1, y_2)\right) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(y_1, y_2)\, p(y_1, y_2) \, dy_1 dy_2$

of some function $g(y_1, y_2)$ of $(y_1, y_2)$. If $g(y_1, y_2)\, p(y_1, y_2) \to 0$ as $(y_1, y_2) \to \infty$, then

$\frac{\partial^n I(\rho)}{\partial \rho^n} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \frac{\partial^{2n} g(y_1, y_2)}{\partial y_1^n\, \partial y_2^n}\, p(y_1, y_2) \, dy_1 dy_2 = E\left(\frac{\partial^{2n} g(y_1, y_2)}{\partial y_1^n\, \partial y_2^n}\right).$
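Before the proof, a numerical sanity check may help. The sketch below uses the hypothetical test function $g(y_1, y_2) = y_1^3 y_2$ (chosen for illustration, not from the source), for which the theorem with $n = 1$ gives $\partial I/\partial\rho = E[3y_1^2] = 3$ and hence $I(\rho) = 3\rho$, since $I(0) = 0$ by symmetry.

```python
import math

import numpy as np

# Sanity check of Price's theorem for n = 1 with the illustrative choice
# g(y1, y2) = y1**3 * y2.  Price predicts dI/drho = E[d^2 g / dy1 dy2]
# = E[3 * y1**2] = 3, hence I(rho) = 3 * rho.
rng = np.random.default_rng(3)
n = 4_000_000

def I_mc(rho):
    """Monte Carlo estimate of I(rho) = E[g(y1, y2)]."""
    y1 = rng.standard_normal(n)
    y2 = rho * y1 + math.sqrt(1 - rho ** 2) * rng.standard_normal(n)
    return np.mean(y1 ** 3 * y2)

value = I_mc(0.5)                        # predicted: 3 * 0.5 = 1.5
slope = (I_mc(0.55) - I_mc(0.45)) / 0.1  # finite-difference dI/drho, ~3
print(value, slope)
```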
Proof. The joint characteristic function of the random variables $y_1$ and $y_2$ is by definition the integral

$\Phi(\omega_1, \omega_2) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} p(y_1, y_2)\, e^{j(\omega_1 y_1 + \omega_2 y_2)} \, dy_1 dy_2 = \exp\left\{-\frac{\omega_1^2 + \omega_2^2 + 2\rho\omega_1\omega_2}{2}\right\}.$
From the two-dimensional inversion formula of the Fourier transform, it follows that

$p(y_1, y_2) = \frac{1}{4\pi^2} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \Phi(\omega_1, \omega_2)\, e^{-j(\omega_1 y_1 + \omega_2 y_2)} \, d\omega_1 d\omega_2 = \frac{1}{4\pi^2} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \exp\left\{-\frac{\omega_1^2 + \omega_2^2 + 2\rho\omega_1\omega_2}{2}\right\} e^{-j(\omega_1 y_1 + \omega_2 y_2)} \, d\omega_1 d\omega_2.$
Therefore, plugging the expression of $p(y_1, y_2)$ into $I(\rho)$ and differentiating with respect to $\rho$, we obtain

$\begin{align}
\frac{\partial^n I(\rho)}{\partial \rho^n}
&= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(y_1, y_2)\, \frac{\partial^n p(y_1, y_2)}{\partial \rho^n} \, dy_1 dy_2 \\
&= \frac{1}{4\pi^2} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(y_1, y_2) \left( \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \frac{\partial^n \Phi(\omega_1, \omega_2)}{\partial \rho^n}\, e^{-j(\omega_1 y_1 + \omega_2 y_2)} \, d\omega_1 d\omega_2 \right) dy_1 dy_2 \\
&= \frac{1}{4\pi^2} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(y_1, y_2) \left( \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (-1)^n \omega_1^n \omega_2^n\, \Phi(\omega_1, \omega_2)\, e^{-j(\omega_1 y_1 + \omega_2 y_2)} \, d\omega_1 d\omega_2 \right) dy_1 dy_2 \\
&= \frac{1}{4\pi^2} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(y_1, y_2) \left( \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \Phi(\omega_1, \omega_2)\, \frac{\partial^{2n} e^{-j(\omega_1 y_1 + \omega_2 y_2)}}{\partial y_1^n\, \partial y_2^n} \, d\omega_1 d\omega_2 \right) dy_1 dy_2 \\
&= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(y_1, y_2)\, \frac{\partial^{2n} p(y_1, y_2)}{\partial y_1^n\, \partial y_2^n} \, dy_1 dy_2,
\end{align}$

where we used $\frac{\partial^n \Phi}{\partial \rho^n} = (-\omega_1\omega_2)^n \Phi$ and $(-1)^n \omega_1^n \omega_2^n\, e^{-j(\omega_1 y_1 + \omega_2 y_2)} = \frac{\partial^{2n}}{\partial y_1^n\, \partial y_2^n} e^{-j(\omega_1 y_1 + \omega_2 y_2)}$.
After repeated integration by parts and using the condition at $(y_1, y_2) \to \infty$, we obtain Price's theorem:

$\begin{align}
\frac{\partial^n I(\rho)}{\partial \rho^n}
&= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(y_1, y_2)\, \frac{\partial^{2n} p(y_1, y_2)}{\partial y_1^n\, \partial y_2^n} \, dy_1 dy_2 \\
&= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \frac{\partial^2 g(y_1, y_2)}{\partial y_1\, \partial y_2}\, \frac{\partial^{2(n-1)} p(y_1, y_2)}{\partial y_1^{n-1}\, \partial y_2^{n-1}} \, dy_1 dy_2 \\
&= \cdots \\
&= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \frac{\partial^{2n} g(y_1, y_2)}{\partial y_1^n\, \partial y_2^n}\, p(y_1, y_2) \, dy_1 dy_2.
\end{align}$
Proof of Arcsine law by Price's Theorem
If $g(y_1, y_2) = \operatorname{sign}(y_1)\operatorname{sign}(y_2)$, then

$\frac{\partial^2 g(y_1, y_2)}{\partial y_1\, \partial y_2} = 4\delta(y_1)\delta(y_2),$

where $\delta(\cdot)$ is the Dirac delta function.

Substituting into Price's theorem, we obtain

$\frac{\partial E\left(\operatorname{sign}(y_1)\operatorname{sign}(y_2)\right)}{\partial \rho} = \frac{\partial I(\rho)}{\partial \rho} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} 4\delta(y_1)\delta(y_2)\, p(y_1, y_2) \, dy_1 dy_2 = \frac{2}{\pi\sqrt{1-\rho^2}}.$
When $\rho = 0$, $E\left(\operatorname{sign}(y_1)\operatorname{sign}(y_2)\right) = I(0) = 0$. Thus

$E\left(\operatorname{sign}(y_1)\operatorname{sign}(y_2)\right) = I(\rho) = \int_{0}^{\rho} \frac{2}{\pi\sqrt{1-\rho'^2}} \, d\rho' = \frac{2}{\pi} \arcsin(\rho),$

which is Van Vleck's well-known arcsine law.
Application
This theorem implies that a simplified correlator can be designed: instead of having to multiply two signals, the cross-correlation problem reduces to the gating of one signal with the sign of another.
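As a sketch of such a simplified (polarity-coincidence) correlator, the example below correlates only the sign bits of two Gaussian signals and then inverts the arcsine law to recover the correlation coefficient; the signal length, seed, and true correlation are illustrative assumptions.

```python
import math

import numpy as np

# One-bit (polarity-coincidence) correlator: correlate only the signs of
# the two signals, then invert the arcsine law
# E[sign(x) sign(y)] = (2/pi) arcsin(rho) to recover the correlation
# coefficient without any analog multiplication.
rng = np.random.default_rng(4)
n = 1_000_000
rho_true = 0.3
x = rng.standard_normal(n)
y = rho_true * x + math.sqrt(1 - rho_true ** 2) * rng.standard_normal(n)

c = np.mean(np.sign(x) * np.sign(y))  # sign-coincidence average (gating)
rho_est = math.sin(math.pi * c / 2)   # arcsine-law inversion
print(rho_est)                        # close to rho_true
```

In hardware, the sign product is a single XNOR gate per sample, which is why one-bit correlators were attractive in early radio astronomy and radar receivers.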
Notes and References
- J. J. Bussgang, "Cross-correlation function of amplitude-distorted Gaussian signals", Res. Lab. Elec., Mass. Inst. Technol., Cambridge, MA, Tech. Rep. 216, March 1952.
- J. H. Van Vleck, "The Spectrum of Clipped Noise", Radio Research Laboratory Report of Harvard University, Report 51, 1943.
- J. H. Van Vleck and D. Middleton, "The spectrum of clipped noise", Proceedings of the IEEE, vol. 54, no. 1, pp. 2–19, January 1966. doi:10.1109/PROC.1966.4567.
- R. Price, "A useful theorem for nonlinear devices having Gaussian inputs", IRE Transactions on Information Theory, vol. 4, no. 2, pp. 69–72, June 1958. doi:10.1109/TIT.1958.1057444.
- A. Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill, 2002, ISBN 0-07-366011-6, p. 396.