
[Discuss-gnuradio] fast parallel filtering


From: Dirk Gorissen
Subject: [Discuss-gnuradio] fast parallel filtering
Date: Sat, 11 Mar 2017 18:28:28 +0000

Hello all,

Given a stream of samples, I would like to apply n slightly different
filters to it, with n chosen at runtime, and then combine the results
back into a single stream.

As a test, I built a flowgraph with the following chains in parallel for n = 6:

          | -> decimating FIR filter 1 -> complex to mag -> |
stream -> | -> decimating FIR filter 2 -> complex to mag -> | -> Max -> ...
          |                    ...                          |
          | -> decimating FIR filter n -> complex to mag -> |

So the same stream is sent to each chain (decimation is 1) and the
output of each chain is pushed through a big Max block with 6 inputs.

This works, but it is not particularly elegant and is annoying to
modify if I decide to change n. It also does not seem computationally
efficient.

What I would like is to replace the above by a single block that

- replicates the input n times
- applies each filter to its replica
- combines the outputs back into a single stream
- has a tunable n parameter
- is fast

I did this with an Embedded Python Block doing essentially this:

for i in range(n):
    out[i] = scipy.signal.lfilter(taps[i], 1, input)
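(As a sanity check that lfilter with a denominator of 1 really behaves
as a plain FIR filter, this standalone snippet compares it against
direct convolution; the firwin taps are just placeholders, not my
actual taps:)

```python
import numpy as np
from scipy import signal

# Placeholder low-pass taps; any FIR taps would do here.
taps = signal.firwin(16, 0.25)

rng = np.random.default_rng(0)
x = rng.standard_normal(256) + 1j * rng.standard_normal(256)

# lfilter with denominator 1 is a plain FIR filter: it matches
# direct convolution truncated to the input length.
y_lfilter = signal.lfilter(taps, 1.0, x)
y_conv = np.convolve(taps, x)[: len(x)]
```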

This uses exactly the same taps as in the chain case. It works, but
the output is different from (and worse than) what I get with the
separate chains. As a test, instead of lfilter I tried:

gnuradio.filter.fir_filter_ccc(1,taps[i]).work(input[0],output)

I thought perhaps that would be a closer replica, but I couldn't get it to work.
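In case it helps, here is a self-contained sketch (plain NumPy/SciPy,
no GNU Radio; the class name is mine) of the single block I am after.
One thing it does that my snippet above did not: lfilter's zi argument
carries each filter's state across calls, which a streaming work()
would also need to do:

```python
import numpy as np
from scipy import signal

class ParallelMaxFilter:
    """Apply n FIR filters to the same stream and emit the per-sample
    max of the output magnitudes. Each filter's state is carried
    across calls via lfilter's zi argument."""

    def __init__(self, taps_list):
        self.taps_list = [np.asarray(t, dtype=complex) for t in taps_list]
        # One zero-initialised state vector per filter (length len(taps) - 1).
        self.zi = [np.zeros(len(t) - 1, dtype=complex) for t in self.taps_list]

    def process(self, samples):
        mags = []
        for i, taps in enumerate(self.taps_list):
            # zi in, updated zi out: the filter resumes where it left off.
            y, self.zi[i] = signal.lfilter(taps, 1.0, samples, zi=self.zi[i])
            mags.append(np.abs(y))
        # Combine the n branches with a per-sample max, like the Max block.
        return np.max(mags, axis=0)
```

Processing the stream in chunks gives the same result as processing it
in one go, which is the property the chain version has and my embedded
block seems to be missing.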

I suspect there is an easy / natural way of doing this in GNU Radio. I
looked at the filter bank / channeliser blocks but failed to get
anywhere.

So what is the best way forward to do this?

Many thanks
Dirk


