The Elias–Bassalygo bound is an upper bound in coding theory on the rate of a code in terms of its relative distance; it limits the trade-off between redundancy and error-correction capability in data transmission and communications.
Let C be a q-ary code of length n, i.e. a subset of [q]^n. Let R be the rate of C and \delta its relative distance. Let

B_q(y,\rho n)=\left\{x\in[q]^n : \Delta(x,y)\leqslant\rho n\right\}

be the Hamming ball of radius \rho n centered at y, where \Delta denotes the Hamming distance, and let

\mathrm{Vol}_q(y,\rho n)=|B_q(y,\rho n)|

be the volume of the Hamming ball of radius \rho n centered at y. The volume of a Hamming ball is translation invariant, i.e. independent of y; in particular, |B_q(y,\rho n)|=|B_q(0,\rho n)|.
For large enough n, the rate R and the relative distance \delta satisfy the Elias–Bassalygo bound

R\leqslant 1-H_q(J_q(\delta))+o(1),
where
H_q(x)\stackrel{\text{def}}{=}-x\log_q\left(\frac{x}{q-1}\right)-(1-x)\log_q(1-x)
is the q-ary entropy function and
J_q(\delta)\stackrel{\text{def}}{=}\left(1-\frac{1}{q}\right)\left(1-\sqrt{1-\frac{q\delta}{q-1}}\right)

is a function related to the Johnson bound.
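As a quick numerical illustration (a minimal sketch, not part of the original argument; the helper names H_q, J_q and elias_bassalygo_rate are ours, not from any library), the bound can be evaluated directly:

    from math import log, sqrt

    def H_q(x: float, q: int) -> float:
        # q-ary entropy function, with the convention H_q(0) = 0.
        if x == 0:
            return 0.0
        return -x * log(x / (q - 1), q) - (1 - x) * log(1 - x, q)

    def J_q(delta: float, q: int) -> float:
        # Johnson radius; requires delta <= 1 - 1/q for the square root to be real.
        return (1 - 1 / q) * (1 - sqrt(1 - q * delta / (q - 1)))

    def elias_bassalygo_rate(delta: float, q: int) -> float:
        # Asymptotic upper bound on the rate: 1 - H_q(J_q(delta)).
        return 1 - H_q(J_q(delta, q), q)

    print(elias_bassalygo_rate(0.1, 2))  # ~0.70 for binary codes with relative distance 0.1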
To prove the Elias–Bassalygo bound, start with the following Lemma:
Lemma. For C\subseteq[q]^n and 0\leqslant e\leqslant n, there exists a Hamming ball of radius e with at least

\frac{|C|\,\mathrm{Vol}_q(0,e)}{q^n}

codewords in it.
Proof of Lemma. Pick a received word y\in[q]^n uniformly at random and let B_q(y,e) be the Hamming ball centered at y with radius e. Since y is chosen uniformly at random, the expected size of the overlap B_q(y,e)\cap C is

\mathbb{E}\,|B_q(y,e)\cap C|=\frac{|C|\,\mathrm{Vol}_q(y,e)}{q^n}.
Since this is the expected size, there must exist at least one y such that

|B_q(y,e)\cap C|\geqslant\frac{|C|\,\mathrm{Vol}_q(y,e)}{q^n}=\frac{|C|\,\mathrm{Vol}_q(0,e)}{q^n},

for otherwise the expectation would be smaller than this value.
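For small parameters the averaging identity behind the lemma can be verified exhaustively. The following sketch uses assumed toy values q=2, n=10, e=3 and an arbitrary random code; all names are illustrative:

    import itertools, random
    from math import comb

    q, n, e = 2, 10, 3

    def hamming(x, y):
        # Hamming distance between two tuples.
        return sum(a != b for a, b in zip(x, y))

    def vol(e):
        # Exact volume of a q-ary Hamming ball of radius e in [q]^n.
        return sum(comb(n, i) * (q - 1) ** i for i in range(e + 1))

    random.seed(0)
    space = list(itertools.product(range(q), repeat=n))
    C = random.sample(space, 40)  # an arbitrary code, for illustration only

    # Average |B_q(y,e) ∩ C| over all centers y: it equals |C|·Vol_q(0,e)/q^n exactly,
    # so some y must achieve at least this many codewords in its ball.
    avg = sum(sum(hamming(y, c) <= e for c in C) for y in space) / q**n
    print(avg, len(C) * vol(e) / q**n)  # both print 6.875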
Now we prove the Elias–Bassalygo bound. Define e=nJ_q(\delta)-1. By the lemma above, there exists a Hamming ball containing B codewords with

B\geqslant\frac{|C|\,\mathrm{Vol}_q(0,e)}{q^n}.
By the Johnson bound, we have B\leqslant qdn. Thus,

|C|\leqslant qdn\cdot\frac{q^n}{\mathrm{Vol}_q(0,e)}\leqslant q^{n\left(1-H_q(J_q(\delta))+o(1)\right)}.
The second inequality follows from the following lower bound on the volume of a Hamming ball:

\mathrm{Vol}_q\left(0,\left\lfloor\frac{d-1}{2}\right\rfloor\right)\geqslant q^{H_q\left(\frac{\delta}{2}\right)n-o(n)}.
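This volume bound can be sanity-checked numerically at moderate block length. The sketch below uses assumed parameters q=2, n=1000, d=101; the gap between the two printed exponents is the o(n)/n error term:

    from math import comb, log

    def vol(q, n, e):
        # Exact volume of a q-ary Hamming ball of radius e in [q]^n.
        return sum(comb(n, i) * (q - 1) ** i for i in range(e + 1))

    def H_q(x, q):
        # q-ary entropy function.
        return -x * log(x / (q - 1), q) - (1 - x) * log(1 - x, q)

    q, n, d = 2, 1000, 101
    e = (d - 1) // 2
    print(log(vol(q, n, e), q) / n)  # ~0.282, the true exponent of the ball volume
    print(H_q(d / n / 2, q))         # ~0.289, the entropy exponent H_q(delta/2)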
Putting in d=2e+1 (so that the ball radius \left\lfloor\frac{d-1}{2}\right\rfloor equals e and \frac{d}{2n}=J_q(\delta)-\frac{1}{2n}) and using the continuity of H_q gives

\mathrm{Vol}_q(0,e)\geqslant q^{H_q(J_q(\delta))n-o(n)}.
Therefore we have

R=\frac{\log_q|C|}{n}\leqslant 1-H_q(J_q(\delta))+o(1),

which is the Elias–Bassalygo bound.
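Finally, the whole chain of inequalities can be instantiated at a finite block length (again a sketch under assumed parameters q=2, n=1000, d=110; the finite-n bound is slightly weaker than the asymptotic one, the difference being the o(1) term):

    from math import comb, log, sqrt, floor

    def vol(q, n, e):
        return sum(comb(n, i) * (q - 1) ** i for i in range(e + 1))

    def H_q(x, q):
        return -x * log(x / (q - 1), q) - (1 - x) * log(1 - x, q)

    def J_q(delta, q):
        return (1 - 1 / q) * (1 - sqrt(1 - q * delta / (q - 1)))

    q, n, d = 2, 1000, 110
    delta = d / n
    e = floor(n * J_q(delta, q) - 1)

    # Finite-n form of the chain |C| <= qdn * q^n / Vol_q(0,e), expressed as a rate.
    rate_bound = (log(q * d * n, q) + n - log(vol(q, n, e), q)) / n
    print(rate_bound)                   # ~0.706: finite-n rate bound from the proof
    print(1 - H_q(J_q(delta, q), q))    # ~0.679: asymptotic Elias–Bassalygo bound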