In coding theory, generalized minimum-distance (GMD) decoding provides an efficient algorithm for decoding concatenated codes, which is based on using an errors-and-erasures decoder for the outer code.
A naive decoding algorithm for concatenated codes is not optimal because it ignores the information that maximum likelihood decoding (MLD) provides: every inner received word is treated the same, regardless of how far it lies from its nearest inner codeword. Intuitively, the outer decoder should place higher confidence in symbols whose inner encodings are close to the received word. In 1966, David Forney devised a better algorithm, called generalized minimum distance (GMD) decoding, which makes better use of this information. It measures the confidence of each received inner word and erases symbols whose confidence falls below a desired threshold. GMD decoding was one of the first examples of a soft-decision decoder. We will present three versions of the GMD decoding algorithm: the first two are randomized algorithms, while the last one is deterministic.
Hamming distance: Given two vectors u, v \in \Sigma^n, the Hamming distance between u and v, denoted \Delta(u, v), is the number of positions in which u and v differ.

Minimum distance: Let C \subseteq \Sigma^n be a code. The minimum distance of C is d = \min_{c_1 \ne c_2 \in C} \Delta(c_1, c_2).

Code concatenation: Given a message m = (m_1, \ldots, m_K) \in [Q]^K, consider two codes, an outer code C_{out} : [Q]^K \to [Q]^N and an inner code C_{in} : [q]^k \to [q]^n, and let their minimum distances be D and d respectively. The concatenated code is

C_{out} \circ C_{in}(m) = (C_{in}(C_{out}(m)_1), \ldots, C_{in}(C_{out}(m)_N)), where C_{out}(m) = (C_{out}(m)_1, \ldots, C_{out}(m)_N).

Finally, we will take C_{out} to be a Reed–Solomon code, which has an errors-and-erasures decoder, and K = O(\log N), which in turn implies that MLD on the inner code runs in time polynomial in N.

Maximum likelihood decoding (MLD): MLD is a decoding method for error-correcting codes that outputs the codeword closest to the received word in Hamming distance. The MLD function D_{MLD} : \Sigma^n \to C is defined, for every y \in \Sigma^n, by D_{MLD}(y) = \arg\min_{c \in C} \Delta(c, y).
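To make these definitions concrete, here is a minimal Python sketch (our own illustration, not part of the original development) of the Hamming distance and brute-force MLD, using a small repetition code as the inner code:

```python
def hamming_distance(u, v):
    """Delta(u, v): the number of positions in which u and v differ."""
    return sum(a != b for a, b in zip(u, v))

def mld(y, code):
    """Brute-force maximum likelihood decoding: return the codeword
    in `code` closest to the received word `y` in Hamming distance."""
    return min(code, key=lambda c: hamming_distance(c, y))

# Example: the binary repetition code of length 3, minimum distance d = 3.
code = [(0, 0, 0), (1, 1, 1)]
print(hamming_distance((1, 0, 1), (0, 0, 1)))  # 1
print(mld((1, 0, 1), code))                    # (1, 1, 1)
```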
Probability distribution: A probability distribution \Pr on a sample space S is a mapping from events of S to real numbers such that \Pr[A] \ge 0 for every event A, \Pr[S] = 1, and \Pr[A \cup B] = \Pr[A] + \Pr[B] for every two mutually exclusive events A and B.

Expected value: The expected value of a discrete random variable X is E[X] = \sum_x x \Pr[X = x].
Consider the received word y = (y_1, \ldots, y_N) \in [q^n]^N, which was corrupted by a noisy channel. The following is the algorithm description for the general case: we decode y by declaring an erasure at every low-confidence position and running the errors-and-erasures decoding algorithm for C_{out} on the resulting vector.
Randomized_Decoder
Given: y = (y_1, \ldots, y_N) \in [q^n]^N.
1. For every 1 \le i \le N, compute y_i' = MLD_{C_{in}}(y_i).
2. For every 1 \le i \le N, compute \omega_i = \min(\Delta(C_{in}(y_i'), y_i), d/2).
3. For every 1 \le i \le N, with probability 2\omega_i/d set y_i'' \leftarrow ?, otherwise set y_i'' = y_i'.
4. Run the errors-and-erasures algorithm for C_{out} on y'' = (y_1'', \ldots, y_N'').
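A minimal Python sketch of the four steps above follows; `inner_codebook` (a hypothetical dict mapping each outer symbol to its inner codeword) and `outer_ee_decode` (a stand-in for the outer errors-and-erasures decoder) are assumptions of this illustration, not objects defined in the text:

```python
import random

def hamming_distance(u, v):
    return sum(a != b for a, b in zip(u, v))

def randomized_gmd_decode(y, inner_codebook, outer_ee_decode, d):
    """One run of Randomized_Decoder on y = (y_1, ..., y_N)."""
    y2 = []
    for yi in y:
        # Step 1: brute-force MLD for the inner code.
        yi_prime = min(inner_codebook,
                       key=lambda s: hamming_distance(inner_codebook[s], yi))
        # Step 2: confidence of position i, capped at d/2.
        wi = min(hamming_distance(inner_codebook[yi_prime], yi), d / 2)
        # Step 3: erase with probability 2*wi/d.
        y2.append('?' if random.random() < 2 * wi / d else yi_prime)
    # Step 4: outer errors-and-erasures decoding on y''.
    return outer_ee_decode(y2)
```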
Theorem 1. Let y be a received word such that there exists a codeword c = (c_1, \ldots, c_N) \in C_{out} \circ C_{in} \subseteq [q^n]^N with \Delta(c, y) < Dd/2. Then the deterministic GMD algorithm outputs c.

Note that a naive decoding algorithm for concatenated codes can correct up to Dd/4 errors.
Lemma 1. Let the assumption in Theorem 1 hold. If y'' has e' errors and s' erasures (when compared with c) after step 3, then E[2e' + s'] < D.
Remark. If 2e' + s' < D, then the errors-and-erasures algorithm in step 4 will output c. The lemma above says that in expectation this is indeed the case. Note that this is not enough to prove Theorem 1, but it is crucial in developing the variations of the algorithm below.
Proof of Lemma 1. For every 1 \le i \le N, define e_i = \Delta(y_i, c_i). This implies that

\sum_{i=1}^N e_i < \frac{Dd}{2}.  (1)

Next, for every 1 \le i \le N, define two indicator random variables:

X_i^? = 1 iff y_i'' = ?, and X_i^e = 1 iff C_{in}(y_i'') \ne c_i and y_i'' \ne ?.

We claim that we are done if we can show that, for every 1 \le i \le N,

E[2X_i^e + X_i^?] \le \frac{2e_i}{d}.  (2)

Clearly, by definition e' = \sum_i X_i^e and s' = \sum_i X_i^?. Further, by the linearity of expectation, (2) together with (1) gives E[2e' + s'] \le \frac{2}{d} \sum_i e_i < D, as desired.
To prove (2) we consider two cases: the i-th block is correctly decoded (Case 1), or the i-th block is incorrectly decoded (Case 2).
Case 1 (c_i = C_{in}(y_i')): Note that if y_i'' = ? then X_i^e = 0. Since \Pr[y_i'' = ?] = 2\omega_i/d, we have

E[X_i^?] = \Pr[X_i^? = 1] = \frac{2\omega_i}{d}, and E[X_i^e] = \Pr[X_i^e = 1] = 0.

Further, by definition we have

\omega_i = \min(\Delta(C_{in}(y_i'), y_i), d/2) \le \Delta(C_{in}(y_i'), y_i) = \Delta(c_i, y_i) = e_i,

so E[2X_i^e + X_i^?] = 2\omega_i/d \le 2e_i/d, proving (2) in this case.
Case 2 (c_i \ne C_{in}(y_i')): In this case,

E[X_i^?] = \frac{2\omega_i}{d}, and E[X_i^e] = \Pr[X_i^e = 1] = 1 - \frac{2\omega_i}{d}.

Since c_i \ne C_{in}(y_i'), we have e_i + \omega_i \ge d. To see this, consider two sub-cases: if \omega_i = \Delta(C_{in}(y_i'), y_i) < d/2, then the triangle inequality gives d \le \Delta(c_i, C_{in}(y_i')) \le e_i + \omega_i; if instead \omega_i = d/2, then \Delta(C_{in}(y_i'), y_i) \ge d/2, and since y_i' is the MLD output, e_i = \Delta(c_i, y_i) \ge \Delta(C_{in}(y_i'), y_i) \ge d/2, so again e_i + \omega_i \ge d. Finally, this implies

E[2X_i^e + X_i^?] = 2\left(1 - \frac{2\omega_i}{d}\right) + \frac{2\omega_i}{d} = 2 - \frac{2\omega_i}{d} \le \frac{2e_i}{d},

which proves (2) and completes the proof of Lemma 1.
In the following sections, we will finally show that the deterministic version of the algorithm above can do unique decoding of C_{out} \circ C_{in} up to half its design distance.
Note that, in step 3 of the previous version of the GMD algorithm, we do not really need to use fresh randomness for each i. Now we come up with another randomized version of the GMD algorithm that uses the same randomness for every i.
Modified_Randomized_Decoder
Given: y = (y_1, \ldots, y_N) \in [q^n]^N, pick \theta \in [0, 1] uniformly at random. Then for every 1 \le i \le N:
1. Compute y_i' = MLD_{C_{in}}(y_i).
2. Compute \omega_i = \min(\Delta(C_{in}(y_i'), y_i), d/2).
3. If \theta < 2\omega_i/d, set y_i'' \leftarrow ?, otherwise set y_i'' = y_i'.
4. Run the errors-and-erasures algorithm for C_{out} on y'' = (y_1'', \ldots, y_N'').
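The change from the previous sketch is confined to step 3: one threshold \theta, drawn once, replaces the per-position coin flips. A sketch, reusing the hypothetical `inner_codebook` and `outer_ee_decode` from before:

```python
import random

def hamming_distance(u, v):
    return sum(a != b for a, b in zip(u, v))

def modified_gmd_decode(y, inner_codebook, outer_ee_decode, d, theta=None):
    """Modified_Randomized_Decoder: one shared threshold theta for all i."""
    if theta is None:
        theta = random.random()  # theta picked uniformly from [0, 1]
    y2 = []
    for yi in y:
        yi_prime = min(inner_codebook,
                       key=lambda s: hamming_distance(inner_codebook[s], yi))
        wi = min(hamming_distance(inner_codebook[yi_prime], yi), d / 2)
        # Step 3: erase iff theta < 2*wi/d (no fresh randomness per i).
        y2.append('?' if theta < 2 * wi / d else yi_prime)
    return outer_ee_decode(y2)
```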
For the proof of Lemma 1, we only use the randomness to show that \Pr[y_i'' = ?] = 2\omega_i/d. In this version of the GMD algorithm, we note that

\Pr[y_i'' = ?] = \Pr[\theta \in [0, 2\omega_i/d]] = \frac{2\omega_i}{d}.

The second equality above follows from the choice of \theta. The proof of Lemma 1 therefore also shows E[2e' + s'] < D for this version of the GMD algorithm. In the next section, we will see how to choose \theta from a polynomially sized set, as opposed to the infinite set [0, 1], to obtain a deterministic algorithm.
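As a quick sanity check (our own illustration, with arbitrary example values for \omega_i and d), a uniform \theta in [0, 1] satisfies \Pr[\theta < p] = p for any p \in [0, 1], so the shared-threshold decoder erases each position with exactly the same probability as the fresh-randomness version:

```python
import random

p = 2 * 3 / 10  # the erasure probability 2*w_i/d with, e.g., w_i = 3, d = 10
trials = 100_000
hits = sum(random.random() < p for _ in range(trials))
print(hits / trials)  # close to p = 0.6
```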
Let Q = \{0, 1\} \cup \{2\omega_1/d, \ldots, 2\omega_N/d\}, where \omega_i = \min(\Delta(C_{in}(y_i'), y_i), d/2) for each i. Then Q = \{0, 1\} \cup \{q_1, \ldots, q_m\} for some q_1 < \cdots < q_m, where m \le \lfloor d/2 \rfloor. Note that for every \theta \in [q_i, q_{i+1}), step 3 of the modified randomized decoder outputs the same y''. Thus, it suffices to consider all possible values of \theta \in Q. This gives the deterministic algorithm below.
Deterministic_Decoder
Given: y = (y_1, \ldots, y_N) \in [q^n]^N. For every \theta \in Q, repeat steps 1 through 4.
1. Compute y_i' = MLD_{C_{in}}(y_i) for every 1 \le i \le N.
2. Compute \omega_i = \min(\Delta(C_{in}(y_i'), y_i), d/2) for every 1 \le i \le N.
3. If \theta < 2\omega_i/d, set y_i'' \leftarrow ?, otherwise set y_i'' = y_i'.
4. Run the errors-and-erasures algorithm for C_{out} on y'' = (y_1'', \ldots, y_N''). Let c_\theta be the codeword in C_{out} \circ C_{in} corresponding to the output of the algorithm, if any.
5. Among all the c_\theta output in step 4, output the one closest to y.
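A sketch of the derandomized loop, under the same assumptions as the earlier sketches; here `outer_ee_decode` is assumed to return None when decoding fails:

```python
def hamming_distance(u, v):
    return sum(a != b for a, b in zip(u, v))

def deterministic_gmd_decode(y, inner_codebook, outer_ee_decode, d):
    """Deterministic_Decoder: try every threshold theta in Q and keep
    the candidate codeword closest to y."""
    # Steps 1-2 do not depend on theta, so compute them once.
    inner = []
    for yi in y:
        yi_prime = min(inner_codebook,
                       key=lambda s: hamming_distance(inner_codebook[s], yi))
        wi = min(hamming_distance(inner_codebook[yi_prime], yi), d / 2)
        inner.append((yi_prime, wi))
    Q = {0.0, 1.0} | {2 * wi / d for _, wi in inner}
    best = None
    for theta in sorted(Q):
        # Step 3: the erasure pattern is fully determined by this theta.
        y2 = ['?' if theta < 2 * wi / d else yi_prime
              for yi_prime, wi in inner]
        c_theta = outer_ee_decode(y2)  # outer symbols, or None on failure
        if c_theta is None:
            continue
        # Step 5: keep the candidate whose concatenated encoding is
        # closest to the received word y.
        dist = sum(hamming_distance(inner_codebook[s], yi)
                   for s, yi in zip(c_theta, y))
        if best is None or dist < best[0]:
            best = (dist, c_theta)
    return best[1] if best else None
```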
Since every iteration of steps 1 through 4 can be run in polynomial time, the algorithm above also runs in polynomial time. Specifically, each call to the errors-and-erasures decoder correcting < dD/2 errors takes O(d) time. Hence, the running time of the algorithm above is O(NQn^{O(1)} + NT_{out}), where T_{out} is the running time of the outer errors-and-erasures decoder.