[Discuss-gnuradio] OFDM with noise - lost packets

From: Mateusz
Subject: [Discuss-gnuradio] OFDM with noise - lost packets
Date: Sun, 20 Dec 2015 02:41:01 +0100
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101 Thunderbird/38.4.0

Hello,

I have been evaluating the OFDM examples at the beginning of my Bachelor's thesis. In ofdm_rx I added a noise source and file sinks so I can plot BER(SNR) in MATLAB, plus a Tag Debug block on the output. The result is disappointing: I observe plenty of lost packets already at an SNR of about 13 dB and below (a noise amplitude of about 1.5 in an otherwise unmodified ofdm_rx). With such performance, are these examples useful for running with USRPs and antennas? Which part of the synchronization process fails in a noisy channel and causes the missing packet indexes observed in Tag Debug? I suspect it is the 'header recognition' loop, i.e. the coarse frequency offset becomes too large for the channel estimator.
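For context, this is the kind of bit-level comparison my MATLAB script performs on the file-sink dumps; a minimal Python/NumPy sketch (the function name and the assumption that both sinks dump equal-rate unpacked byte streams are mine, not from the examples):

```python
import numpy as np

def bit_error_rate(tx_bytes: bytes, rx_bytes: bytes) -> float:
    """Compare two byte streams bit by bit and return the bit error rate.

    Only the overlapping prefix is compared; with lost packets the
    streams would first need to be re-aligned, which this sketch skips.
    """
    n = min(len(tx_bytes), len(rx_bytes))
    tx = np.frombuffer(tx_bytes[:n], dtype=np.uint8)
    rx = np.frombuffer(rx_bytes[:n], dtype=np.uint8)
    # XOR marks differing bits; unpackbits expands each byte to 8 bits
    # so the mismatches can simply be summed.
    errors = int(np.unpackbits(np.bitwise_xor(tx, rx)).sum())
    return errors / (8 * n)
```

Repeating this for each noise amplitude gives one BER(SNR) point per run.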

Mainly, I would like to know whether ofdm_rx is only usable with low noise, i.e. at an SNR above roughly 13 dB.

Mateusz Loch
