On Thu, Sep 8, 2011 at 6:47 PM, Guanbo Zheng <address@hidden>
I am currently using the OFDM benchmark to generate an OFDM signal with settings for FFT length, CP length, occupied tones, and so on.
But I cannot find out what the real bandwidth of the generated signal is.
When I changed the interpolation rate (sampling rate), the bandwidth at the RX changed as well.
Ideally, with a large enough sampling rate (on the USRP2, the max fs = 25 MHz), I would expect to observe the same signal with a fixed BW.
It seems to me that the BW of the generated signal is too large.
My question is: how is the BW of the transmit signal determined in the code? Where can I change it?
All I found is actual bit rate = (converter_) / xrate / samples_per_symbol = 100 MHz / 4 / 2. But this does not seem to be related to the BW of the signal itself.
Thanks for any suggestions!
The bandwidth of the signal changes with the interpolation rate. If you set the interpolation rate such that you get 25 MHz of bandwidth out, then the OFDM signal will also have a 25 MHz bandwidth. What you will _see_ over the air is 25e6 * (occupied_tones/fft_length), since the ratio of occupied tones to the total number of subcarriers is the fraction of the bandwidth actually used.
You can also think of it this way. The bandwidth of a subcarrier is BW/fft_length, where BW is the sample rate out of the USRP.
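To make those two relationships concrete, here is a minimal sketch in Python. The sample rate, FFT length, and occupied-tone count below are assumed example values, not numbers from this thread:

```python
# Example values (assumed, not from the thread above)
fs = 25e6             # sample rate out of the USRP, in Hz
fft_length = 512      # total number of subcarriers
occupied_tones = 200  # subcarriers actually carrying data

# Bandwidth of a single subcarrier: BW / fft_length
subcarrier_bw = fs / fft_length

# Over-the-air occupied bandwidth: BW * (occupied_tones / fft_length)
occupied_bw = fs * occupied_tones / fft_length

print(subcarrier_bw)  # 48828.125 Hz per subcarrier
print(occupied_bw)    # 9765625.0 Hz occupied over the air
```

So with these example numbers, even though the USRP is putting out 25 MHz of samples, only about 9.77 MHz of spectrum is actually occupied by data subcarriers.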