In coding theory, decoding is the process of translating received messages into codewords of a given code. There are many common methods of mapping messages to codewords. These are often used to recover messages sent over a noisy channel, such as a binary symmetric channel.
Throughout this article, $C \subset \mathbb{F}_2^n$ is a binary code of length $n$; $x$ and $y$ are elements of $\mathbb{F}_2^n$; and $d(x,y)$ is the Hamming distance between those elements.
One may be given the message $x \in \mathbb{F}_2^n$; ideal observer decoding then generates the codeword $y \in C$ that maximises

$P(y \text{ sent} \mid x \text{ received}),$

that is, the codeword $y$ that is most likely to have been sent, given that the message $x$ was received.
Such a codeword need not be unique: there may be more than one codeword with an equal likelihood of mutating into the received message. In such a case, the sender and receiver(s) must agree ahead of time on a decoding convention. Popular conventions include requesting that the codeword be resent (automatic repeat-request), choosing any codeword from the set of most likely candidates, or reporting a decoding failure.
In maximum likelihood decoding, given a received vector $x \in \mathbb{F}_2^n$, one picks a codeword $y \in C$ that maximises

$P(x \text{ received} \mid y \text{ sent}),$

that is, the codeword $y$ that maximises the probability that $x$ was received, given that $y$ was sent. If all codewords are equally likely to be sent, then this scheme is equivalent to ideal observer decoding. Indeed, by Bayes' theorem,

\begin{align}
P(x \text{ received} \mid y \text{ sent}) &= \frac{P(x \text{ received}, y \text{ sent})}{P(y \text{ sent})} \\
&= P(y \text{ sent} \mid x \text{ received}) \cdot \frac{P(x \text{ received})}{P(y \text{ sent})}.
\end{align}

Since $x$ is fixed, $P(x \text{ received})$ is a constant, and $P(y \text{ sent})$ is the same for every codeword when all codewords are equally likely to be sent. Therefore $P(x \text{ received} \mid y \text{ sent})$ is maximised as a function of $y$ exactly when $P(y \text{ sent} \mid x \text{ received})$ is maximised.
As with ideal observer decoding, a convention must be agreed to for non-unique decoding.
The maximum likelihood decoding problem can also be modeled as an integer programming problem.
The maximum likelihood decoding algorithm is an instance of the "marginalize a product function" problem which is solved by applying the generalized distributive law.
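As a concrete illustration of the definition (rather than of the integer-programming or generalised-distributive-law formulations just mentioned), the sketch below brute-forces maximum likelihood decoding over a binary symmetric channel: it evaluates $P(x \text{ received} \mid y \text{ sent}) = p^d (1-p)^{n-d}$ for every codeword $y$, where $d = d(x,y)$, and keeps a maximiser. The codebook and crossover probability are assumptions made up for the example, and the exhaustive search is only feasible for very small codes.

```python
def bsc_likelihood(x, y, p):
    """P(x received | y sent) on a binary symmetric channel with
    crossover probability p: each bit flips independently with probability p."""
    d = sum(xi != yi for xi, yi in zip(x, y))   # Hamming distance d(x, y)
    n = len(x)
    return (p ** d) * ((1 - p) ** (n - d))

def ml_decode(x, code, p):
    """Return a codeword y in `code` maximising P(x received | y sent)."""
    return max(code, key=lambda y: bsc_likelihood(x, y, p))

# Assumed toy codebook: the length-3 repetition code {000, 111}.
code = [(0, 0, 0), (1, 1, 1)]
print(ml_decode((0, 1, 0), code, p=0.1))        # -> (0, 0, 0)
```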
In minimum distance decoding, given a received vector $x \in \mathbb{F}_2^n$, one picks a codeword $y \in C$ that minimises the Hamming distance

$d(x,y) = \#\{i : x_i \neq y_i\},$

i.e. the codeword $y$ that is as close as possible to $x$. Note that if the probability $p$ of error on a discrete memoryless channel is strictly less than one half, then minimum distance decoding is equivalent to maximum likelihood decoding, since if $d(x,y) = d$, then

\begin{align}
P(y \text{ received} \mid x \text{ sent}) &= (1-p)^{n-d} \cdot p^d \\
&= (1-p)^n \cdot \left(\frac{p}{1-p}\right)^d,
\end{align}

which (since $p$ is less than one half) is maximised by minimising $d$; by the symmetry of the channel, this quantity equals $P(x \text{ received} \mid y \text{ sent})$.
Minimum distance decoding is also known as nearest neighbour decoding. It can be assisted or automated by using a standard array. Minimum distance decoding is a reasonable decoding method when the following conditions are met: the probability $p$ that an error occurs is independent of the position of the symbol, and errors are independent events (an error at one position in the message does not affect other positions).
These assumptions may be reasonable for transmissions over a binary symmetric channel. They may be unreasonable for other media, such as a DVD, where a single scratch on the disk can cause an error in many neighbouring symbols or codewords.
As with other decoding methods, a convention must be agreed to for non-unique decoding.
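The following is a minimal sketch of nearest neighbour decoding without a standard array, under the assumption that the code is small enough to enumerate: all $2^k$ codewords are generated from a generator matrix (here the $[7,4]$ Hamming code, chosen purely as an example), and a codeword at minimum Hamming distance from the received word is returned, with ties broken arbitrarily (exactly the non-uniqueness that the agreed convention must resolve).

```python
from itertools import product

# Assumed example: generator matrix of the [7,4] Hamming code in systematic form.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(m, G):
    """Codeword c = mG over F_2."""
    return tuple(sum(mi * gij for mi, gij in zip(m, col)) % 2
                 for col in zip(*G))

def hamming(x, y):
    return sum(xi != yi for xi, yi in zip(x, y))

# Enumerate all 2^k codewords (only feasible for small k).
k = len(G)
codewords = [encode(m, G) for m in product([0, 1], repeat=k)]

def min_distance_decode(x):
    """Return a codeword nearest to x; ties are broken arbitrarily."""
    return min(codewords, key=lambda c: hamming(x, c))

received = (1, 0, 0, 0, 1, 1, 1)      # codeword 1000110 with one flipped bit
print(min_distance_decode(received))  # -> (1, 0, 0, 0, 1, 1, 0)
```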
Syndrome decoding is a highly efficient method of decoding a linear code over a noisy channel, i.e. one on which errors are made. In essence, syndrome decoding is minimum distance decoding using a reduced lookup table. This is allowed by the linearity of the code.
Suppose that $C \subset \mathbb{F}_2^n$ is a linear code of length $n$ and minimum distance $d$ with parity-check matrix $H$. Then clearly $C$ is capable of correcting up to

$t = \left\lfloor \frac{d-1}{2} \right\rfloor$

errors made by the channel (since if no more than $t$ errors are made then minimum distance decoding will still correctly decode the incorrectly transmitted codeword).

Now suppose that a codeword $x \in \mathbb{F}_2^n$ is sent over the channel and the error pattern $e \in \mathbb{F}_2^n$ occurs, so that $z = x + e$ is received. Ordinary minimum distance decoding would look up the vector $z$ in a table of size $|C|$ for the nearest match, i.e. an element (not necessarily unique) $c \in C$ with $d(c,z) \leq d(y,z)$ for all $y \in C$. Syndrome decoding takes advantage of the property of the parity-check matrix that

$Hx = 0$

for all $x \in C$. The syndrome of the received $z = x + e$ is defined to be

$Hz = H(x + e) = Hx + He = 0 + He = He.$
To perform ML decoding in a binary symmetric channel, one has to look up a precomputed table of size $2^{n-k}$, mapping $He$ to $e$.
Note that this is already significantly less complex than standard array decoding.
However, under the assumption that no more than $t$ errors were made during transmission, the receiver can look up the value $He$ in a further reduced table of size

$\sum_{i=0}^{t} \binom{n}{i}.$
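A minimal sketch of syndrome decoding under the assumptions above: the parity-check matrix below is that of the $[7,4]$ Hamming code (an example choice, with $d = 3$ and hence $t = 1$), the table maps each syndrome $He$ to the error pattern $e$ for every $e$ of weight at most $t$, and a received word $z$ is decoded by subtracting the looked-up error pattern.

```python
from itertools import combinations

# Assumed example: parity-check matrix H of the [7,4] Hamming code
# (minimum distance d = 3, so t = floor((d-1)/2) = 1).
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]
n, t = 7, 1

def syndrome(v):
    """Compute Hv over F_2 as a tuple."""
    return tuple(sum(hij * vj for hij, vj in zip(row, v)) % 2 for row in H)

# Precomputed table: syndrome He -> error pattern e, for every e of
# Hamming weight at most t (sum_{i=0}^{t} C(n, i) entries, far fewer than 2^n).
table = {}
for w in range(t + 1):
    for positions in combinations(range(n), w):
        e = tuple(1 if i in positions else 0 for i in range(n))
        table[syndrome(e)] = e

def syndrome_decode(z):
    """Return the codeword nearest to z, assuming at most t errors occurred."""
    e = table[syndrome(z)]
    return tuple((zi + ei) % 2 for zi, ei in zip(z, e))

z = (1, 0, 0, 0, 1, 1, 1)      # a codeword with one flipped bit
print(syndrome_decode(z))      # -> (1, 0, 0, 0, 1, 1, 0)
```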
See main article: List decoding.
Information set decoding is a family of Las Vegas probabilistic methods, all based on the observation that it is easier to guess enough error-free positions than it is to guess all the error positions.
The simplest form is due to Prange: Let $G$ be the $k \times n$ generator matrix of $C$ used for encoding. Select $k$ columns of $G$ at random, and denote by $G'$ the corresponding submatrix of $G$. With reasonable probability $G'$ will have full rank, which means that if we let $c'$ be the sub-vector for the corresponding positions of any codeword $c = mG$ of $C$ for a message $m$, we can recover $m$ as $m = c'G'^{-1}$. Hence, if we were lucky enough that these $k$ positions of the received word $y$ contained no errors, and hence equalled the positions of the sent codeword, then we may decode.

If $t$ errors occurred, the probability of such a fortunate selection of columns is given by $\binom{n-t}{k}/\binom{n}{k}$.
This method has been improved in various ways, e.g. by Stern and Canteaut and Sendrier.
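A sketch of the basic Prange form (not the improved variants) under assumed toy parameters: repeatedly pick $k$ random positions of the received word $y$, invert the corresponding $k \times k$ submatrix $G'$ over $\mathbb{F}_2$ if possible, recover a candidate message $m = y'G'^{-1}$, and accept it when the re-encoded codeword lies within $t$ errors of $y$. The $\mathbb{F}_2$ inversion is done with a small Gauss-Jordan routine rather than any particular library, and the generator matrix is the $[7,4]$ Hamming code used in the earlier examples.

```python
import random

def gf2_inv(A):
    """Invert a k x k matrix over F_2 (Gauss-Jordan); None if singular."""
    k = len(A)
    M = [row[:] + [int(i == j) for j in range(k)] for i, row in enumerate(A)]
    for col in range(k):
        piv = next((r for r in range(col, k) if M[r][col]), None)
        if piv is None:
            return None
        M[col], M[piv] = M[piv], M[col]
        for r in range(k):
            if r != col and M[r][col]:
                M[r] = [a ^ b for a, b in zip(M[r], M[col])]
    return [row[k:] for row in M]

def vec_mat(v, M):
    """Row vector times matrix over F_2."""
    return tuple(sum(vi * M[i][j] for i, vi in enumerate(v)) % 2
                 for j in range(len(M[0])))

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

def prange_decode(y, G, t, max_iters=1000):
    """Prange's information set decoding: guess k error-free positions of y."""
    k, n = len(G), len(G[0])
    for _ in range(max_iters):
        I = random.sample(range(n), k)                   # candidate information set
        Gp = [[G[i][j] for j in I] for i in range(k)]    # k x k submatrix G'
        Gp_inv = gf2_inv(Gp)
        if Gp_inv is None:                               # G' singular: try again
            continue
        yp = tuple(y[j] for j in I)                      # restriction y' of y
        m = vec_mat(yp, Gp_inv)                          # candidate message m = y' G'^{-1}
        c = vec_mat(m, G)                                # re-encode: c = mG
        if hamming(c, y) <= t:                           # accept if within t errors
            return m, c
    return None

# Assumed toy example: the [7,4] Hamming code (t = 1).
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
y = (1, 0, 0, 0, 1, 1, 1)                                # codeword 1000110 with one error
print(prange_decode(y, G, t=1))                          # -> ((1, 0, 0, 0), (1, 0, 0, 0, 1, 1, 0))
```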
See main article: PRML.
Partial response maximum likelihood (PRML) is a method for converting the weak analog signal from the head of a magnetic disk or tape drive into a digital signal.
See main article: Viterbi decoder.
A Viterbi decoder uses the Viterbi algorithm for decoding a bitstream that has been encoded using forward error correction based on a convolutional code. The Hamming distance is used as a metric for hard-decision Viterbi decoders. The squared Euclidean distance is used as a metric for soft-decision decoders.
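To make this concrete, here is a minimal hard-decision Viterbi decoder sketch for an assumed rate-1/2, constraint-length-3 convolutional code with generator polynomials $(7, 5)$ in octal, using the Hamming distance as the branch metric; it is a toy illustration (the trellis is not terminated with tail bits, and no soft information is used), not a production decoder.

```python
G1, G2 = 0b111, 0b101          # generator polynomials (7, 5) in octal
N_STATES = 4                   # 2^(K-1) states for constraint length K = 3

def parity(x):
    return bin(x).count("1") % 2

def encode(bits):
    """Convolutional encoder: two output bits per input bit."""
    state = 0                                   # two most recent input bits
    out = []
    for b in bits:
        reg = (b << 2) | state                  # [b_t, b_{t-1}, b_{t-2}]
        out += [parity(reg & G1), parity(reg & G2)]
        state = (reg >> 1) & 0b11               # shift: new state = (b_t, b_{t-1})
    return out

def viterbi_decode(rx):
    """Hard-decision Viterbi decoding of a received bit sequence rx."""
    INF = float("inf")
    metrics = [0] + [INF] * (N_STATES - 1)      # encoder starts in state 0
    paths = [[] for _ in range(N_STATES)]
    for i in range(0, len(rx), 2):
        r = rx[i:i + 2]
        new_metrics = [INF] * N_STATES
        new_paths = [None] * N_STATES
        for state in range(N_STATES):
            if metrics[state] == INF:
                continue
            for b in (0, 1):                    # hypothesise input bit b
                reg = (b << 2) | state
                expected = [parity(reg & G1), parity(reg & G2)]
                branch = sum(e != x for e, x in zip(expected, r))  # Hamming metric
                nxt = (reg >> 1) & 0b11
                m = metrics[state] + branch
                if m < new_metrics[nxt]:        # keep the survivor path per state
                    new_metrics[nxt] = m
                    new_paths[nxt] = paths[state] + [b]
        metrics, paths = new_metrics, new_paths
    best = min(range(N_STATES), key=lambda s: metrics[s])
    return paths[best]

message = [1, 0, 1, 1, 0, 0]
rx = encode(message)
rx[3] ^= 1                                      # flip one channel bit
print(viterbi_decode(rx))                       # -> [1, 0, 1, 1, 0, 0]
```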
An optimal decision decoding algorithm (ODDA) has been proposed for an asymmetric two-way relay channel (TWRC) system.