Subject: Re: [Discuss-gnuradio] OFDM benchmark optimal parameter
Date: Mon, 21 Mar 2016 12:01:13 +0100
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101 Thunderbird/38.6.0
I'd encourage you to either fix the Bit Error Rate block or write
something yourself that does the job. In fact, the unmodified
ofdm_loopback example doesn't work as a BER test: all packets are
identical, and if a packet has errors, the OFDM receiver drops it,
so you'd never see a bit error.
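If you do write something yourself, the core of it is just a bitwise comparison of the known transmitted payload against the received payload. A minimal sketch (the function name and the use of NumPy are my own choices, not anything from the examples):

```python
import numpy as np

def bit_error_rate(tx: bytes, rx: bytes) -> float:
    """Compare two equal-length byte streams bit by bit.

    XOR leaves a 1 bit wherever the streams differ; unpackbits
    expands each byte to its 8 bits so we can count the differences.
    """
    if len(tx) != len(rx):
        raise ValueError("streams must have equal length")
    a = np.frombuffer(tx, dtype=np.uint8)
    b = np.frombuffer(rx, dtype=np.uint8)
    errors = int(np.unpackbits(a ^ b).sum())
    return errors / (8 * len(a))
```

For example, `bit_error_rate(b"\x00", b"\x01")` gives 0.125, since exactly one of eight bits differs.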
Open rx_ofdm.grc; it's a very similar example, but instead of the black box "OFDM Receiver" it shows how the OFDM receiver works internally.
Play with the channel model; e.g. set the noise voltage really high (1.0) and the frequency offset to e.g. 2.0/fft_len. You'll see a lot of
INFO: Detected an invalid packet at item ....
Now vary these parameters.
The ratio of invalid packets to total packets gives you a packet error rate.
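Turning those counts into a number is trivial; a sketch of what I mean (the helper name is hypothetical, and the counts would come from e.g. counting the "Detected an invalid packet" lines in the console output versus packets sent):

```python
def packet_error_rate(n_valid: int, n_invalid: int) -> float:
    """Packet error rate: fraction of all observed packets that
    the receiver dropped as invalid."""
    total = n_valid + n_invalid
    if total == 0:
        raise ValueError("no packets observed")
    return n_invalid / total
```

So 10 dropped packets out of 100 transmitted would give a PER of 0.1; sweep the noise voltage or frequency offset and plot PER against it.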
On 21.03.2016 11:47, Diyar Muhammed wrote: