

From: Marcus Müller
Subject: Soft-Decoding Binary Golay (was: [USRP-users] GPP requirements for USRP B210 amsat)
Date: Mon, 24 May 2021 00:37:37 +0200
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101 Thunderbird/78.10.1

Hi Daniel,

On 23.05.21 10:30, Daniel Estévez wrote:

> <snip>

> That's an interesting idea to decode Golay (23,12). The results would be 
> equivalent to
> algebraic decoding, I believe. This is because the balls of Hamming radius 3 
> centred on
> the codewords are a partition of the total code space. Algebraic decoding 
> solves for the
> centre of the ball that the received word belongs to. Your lookup table would 
> be a table
> of just that: each possible word, together with its corresponding centre.

I think you're right: for perfect codes, hard-decision algebraic decoding should be ML
decoding. "The closest codeword" is the same as "the codeword at the centre of your
sphere", and there's nothing outside any of the spheres.
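For concreteness, the lookup idea can be shrunk from a $2^{23}$-entry word table to a
2048-entry syndrome table: since the code is perfect, the error patterns of weight ≤ 3
hit every 11-bit syndrome exactly once. A minimal sketch (the generator polynomial 0xAE3
is one of the two standard choices; everything else is illustration, not AX100 code):

```python
from itertools import combinations

# One of the two standard generator polynomials of the (23,12) Golay code:
# g(x) = x^11 + x^9 + x^7 + x^6 + x^5 + x + 1
G_POLY = 0b101011100011

def syndrome(word):
    """Remainder of word(x) modulo g(x) over GF(2) -- an 11-bit syndrome."""
    for shift in range(22, 10, -1):
        if (word >> shift) & 1:
            word ^= G_POLY << (shift - 11)
    return word

# Perfect code: the 1 + 23 + 253 + 1771 = 2048 error patterns of weight <= 3
# hit every 11-bit syndrome exactly once.
SYNDROME_TABLE = {}
for weight in range(4):
    for bits in combinations(range(23), weight):
        e = sum(1 << b for b in bits)
        SYNDROME_TABLE[syndrome(e)] = e

def decode_hd(received):
    """Hard-decision decode: strip the coset leader for the observed syndrome."""
    return received ^ SYNDROME_TABLE[syndrome(received)]
```

A full $2^{23}$-entry word-to-codeword table works the same way; the syndrome variant
just trades one polynomial division for a table that easily fits in L1/L2 cache.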

> I wonder which algorithm is faster.

I do, too!

> The lookup table seems like instant decode, but since the table is large-ish, 
> it
> wouldn't fit in cache and the entry would need to be fetched from RAM.

Indeed. If one were to do this at very high rates with CPU caches to spare, one could
partition the table and let different cores hold different parts of it, to maximize
cache usage. Then, of course, all but one core would idle while that one core fetches
the result from L2. That sounds rather inefficient, indeed. It might be something that
works well on a GPU (partition the table across the number of stream processors you
have, get the result instantly from local memory).

> In comparison, algebraic decoding takes a handful of arithmetic operations and that's
> all. So algebraic decoding might be faster, after all.

Yeah, that sounds very true; I simply have no intuition for the complexity of it.

> However, your idea about maximum-likelihood decoding makes me think about what would
> happen if we try to do decoding from soft symbols.

Have to think about that for a moment. (In my head: which received words would be
incorrectly decoded by HD but could be correctly decoded using SD?)

> That's a much more difficult problem. I haven't given it any serious thought, but I
> think a naïve approach is computationally intractable. A quick Google search shows
> some papers doing different tricks to speed this up.

Hm, iterative decoders (Turbo, Tanner Graph/Belief Prop/message passing) would 
probably
heavily suffer from the very short cycles in the non-systematic part of the 
generator
matrix. Might still be worth a try, especially since these should "unroll" 
pretty nicely -
small size, low number of iterations that could ever do any good.

But it'll probably break down to classical BCJR on linear block codes [20, section IV],
which should be pretty doable in complexity (says McEliece [21]); the state count per
trellis section is at most $2^{n-k} = 2^{11}$ in our case, spread over a depth of 23
sections. Sounds like a manageable trellis for BCJR. Still, way more work than HD.
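That trellis size can be sanity-checked by brute force: track the partial syndromes of
all 4096 codewords through the 23 bit positions (this is the Wolf/BCJR trellis state at
each depth, cf. [21]). The systematic bit ordering below (12 message bits, then 11
parity bits) is an assumption; it affects the shape, not the maximum, of the profile:

```python
# Brute-force check of the trellis state profile for the (23,12) Golay code:
# the state at depth d is the partial syndrome over the first d bits.
G_POLY = 0b101011100011

def poly_mod(word):
    """Remainder of word(x) modulo the generator polynomial over GF(2)."""
    for shift in range(22, 10, -1):
        if (word >> shift) & 1:
            word ^= G_POLY << (shift - 11)
    return word

# Assumed systematic ordering: bits 0..11 are message bits (parity-check
# column x^(11+j) mod g), bits 12..22 are parity bits (unit-vector columns).
H_COLS = [poly_mod(1 << (11 + j)) for j in range(12)] + [1 << i for i in range(11)]

def state_profile():
    """Number of distinct partial-syndrome states at each of the 24 depths."""
    states = [set() for _ in range(24)]
    for m in range(1 << 12):
        parity = 0
        for j in range(12):
            if (m >> j) & 1:
                parity ^= H_COLS[j]  # parity bits that zero the full syndrome
        bits = [(m >> j) & 1 for j in range(12)] + \
               [(parity >> i) & 1 for i in range(11)]
        s = 0
        states[0].add(s)
        for i in range(23):
            if bits[i]:
                s ^= H_COLS[i]
            states[i + 1].add(s)
    return [len(x) for x in states]
```

On this ordering the profile starts and ends at a single state and peaks at
$2^{11} = 2048$ states mid-block.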

> In any case, for the AX100 modem I think that this would be overkill. The Golay code
> is used to protect the packet header, which basically contains the packet size. The
> payload itself is a Reed-Solomon codeword. So the Golay code is not the "weakest link
> in the chain". It is more likely that the Reed-Solomon codeword can't be decoded in
> the presence of errors.

:) That's certainly true; after all, it takes a raw BER > 3/23 ≈ 1/8 within the header
for the hard decoder to go wrong. I guess the payload uses a classical algebraic
hard-decision RS decoder?
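That back-of-envelope number can be made precise: the header decode fails exactly when
more than 3 of its 23 bits flip. A quick binomial sketch (the memoryless-channel
assumption is mine):

```python
from math import comb

def golay_header_failure_prob(p):
    """Probability of more than 3 bit errors in 23 bits on a memoryless
    channel with raw bit error rate p (i.e. hard Golay decoding fails)."""
    ok = sum(comb(23, k) * p**k * (1 - p)**(23 - k) for k in range(4))
    return 1.0 - ok
```

At a raw BER of 1%, this comes out well below $10^{-3}$, supporting the point that the
header is not the weakest link.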

If we wanted to go SD on that: I must plainly admit I've got no idea how to
soft-decision decode an RS code, and the green book sadly only deals in techniques that
are well-established :D.

We can certainly build a Tanner graph and throw message passing at the problem, but
that's not going to be fun, because

1. the parity-check matrix of an RS code contains a column of ones and basically all the
   roots of the code, and is thus not low-density at all,
2. each node needs to hold $2^q$ LLRs ($q = 8$ here), not just one (or we vastly
   increase the size of the Tanner graph, I don't know), and
3. I have no experience doing any of that whatsoever.

Might still work. One would probably have to look into the multiple-bases BP
methods [22] (at least it was a footnote in a DLR presentation that pointed me to these,
but I'm anything but an expert on them!).

If it's not going to be a message-passing decoder: I don't know; BCJR over all the
parity symbols (255 - 223 = 32?) with $2^8$ possible state transitions per symbol sounds
like no fun, either.

So, we're left with ordered-statistics methods ([23], from the same DLR guest lecture:
I'm really a n00b). I'm not quite sure how many rounds of hard-decode, evaluate,
re-decode steps we'd need, but since 1995, sorting values and Gaussian elimination on
matrices with fewer than a couple hundred rows have gotten rather cheap, so: this would
be my prime candidate. Personally, I think the method in [23, Sec. II and IV] is pretty.
This might influence my assessment.
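To make the ordered-statistics idea concrete, here is a sketch of order-0 OSD in the
spirit of [23]: sort positions by reliability, Gauss-eliminate the generator matrix onto
the most reliable independent positions, hard-decide those, and re-encode. It is generic
over any full-rank binary generator matrix; the LLR sign convention (positive means
"bit is 0") is again an assumption:

```python
import numpy as np

def osd0_decode(G, llrs):
    """Order-0 ordered-statistics decoding sketch.
    G: full-rank k x n binary generator matrix (numpy array).
    llrs: length-n soft values; llrs[i] > 0 means bit i is more likely 0."""
    k, n = G.shape
    llrs = np.asarray(llrs, dtype=float)
    order = np.argsort(-np.abs(llrs))       # most reliable positions first
    Gp = (G[:, order] % 2).astype(int)
    hard = (llrs[order] < 0).astype(int)    # hard decisions, permuted
    # Gaussian elimination over GF(2): identity on the k most reliable
    # linearly independent columns.
    pivots, row = [], 0
    for col in range(n):
        if row == k:
            break
        nz = np.nonzero(Gp[row:, col])[0]
        if nz.size == 0:
            continue                        # dependent column, skip it
        Gp[[row, row + nz[0]]] = Gp[[row + nz[0], row]]
        for r in range(k):
            if r != row and Gp[r, col]:
                Gp[r] ^= Gp[row]
        pivots.append(col)
        row += 1
    # Hard-decide the reliable pivot positions, re-encode everything else.
    codeword_p = hard[pivots] @ Gp % 2
    out = np.empty(n, dtype=int)
    out[order] = codeword_p
    return out
```

Higher orders additionally re-encode after flipping small subsets of the most reliable
bits and keep the best metric; order 0 is already the "one round of sorting plus
Gaussian elimination" cost floor.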

>
> PS: Please keep me in CC, since I'm not on the mailing list.

Sure thing! I think we should move this to discuss-gnuradio, as it has left the realm of
USRP usage pretty thoroughly, even if that's worse for Martin (sorry!), who I believe is
not on discuss-gnuradio (but might also not be that much into block codes). So, let's do
that.

Best regards,

Marcus


[20] L. R. Bahl, J. Cocke, F. Jelinek, and J. Raviv, "Optimal decoding of linear codes
for minimizing symbol error rate," IEEE Trans. Inform. Theory, vol. IT-20, pp. 284-287,
1974. (This paper is far too short for its own good.)
https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.331.2699&rep=rep1&type=pdf
[21] R. J. McEliece, "On the BCJR trellis for linear block codes," IEEE Trans. Inform.
Theory, vol. 42, no. 4, pp. 1072-1092, 1996.
https://core.ac.uk/download/pdf/4875141.pdf
[22] T. Hehn et al., "Multiple-bases belief-propagation with leaking for decoding of
moderate-length block codes," 7th International ITG Conference on Source and Channel
Coding, VDE, 2008.
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.587.741&rep=rep1&type=pdf
[23] M. P. C. Fossorier and S. Lin, "Soft-decision decoding of linear block codes based
on ordered statistics," IEEE Trans. Inform. Theory, 1995.
https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.453.589&rep=rep1&type=pdf



