[Discuss-gnuradio] RMS value of a signal changes with PAPR
From: Tomaž Šolc
Subject: [Discuss-gnuradio] RMS value of a signal changes with PAPR
Date: Mon, 13 Apr 2015 17:18:14 +0200
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:31.0) Gecko/20100101 Icedove/31.6.0
Dear all,
I'm calculating the RMS value of a signal with the following setup:
RF vector signal generator -> USRP/rtl-sdr/... -> "RMS" block in GRC.
Please bear with me - I'm not interested in the exact (absolute) voltage
level.
What I can't explain is why the calculated RMS value is consistently
higher when the signal is modulated (i.e. has a higher peak-to-average
power ratio) than for CW (an unmodulated sine wave).
For instance, the RMS value shown in GRC for band-limited Gaussian noise
is consistently around 2.5 dB _higher_ than the RMS of a CW signal of
equivalent power (equivalent according to both the generator's level
setting and a spectrum analyzer with a power meter function). Similarly,
a 100% AM modulated signal shows an RMS around 1.3 dB _higher_.
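For what it's worth, a quick numerical sanity check confirms that true RMS should be independent of PAPR when the average power is equal. This is a minimal numpy sketch (not the GRC RMS block; the sine frequency and sample count are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# CW: unit-amplitude sine wave, average power 0.5 (10000 full periods)
cw = np.sin(2 * np.pi * 0.01 * np.arange(n))

# Gaussian noise scaled to the same average power (0.5)
noise = rng.normal(0, 1, n)
noise *= np.sqrt(0.5) / np.std(noise)

def rms(x):
    """Root-mean-square of a real sample vector."""
    return np.sqrt(np.mean(x ** 2))

# Both come out at ~0.707 despite very different peak/average ratios
print(rms(cw), rms(noise))
```

So a 2.5 dB discrepancy at equal generator power shouldn't come from the RMS computation itself, assuming the samples reaching the block really do carry equal power.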
This effect appears with a USRP, an rtl-sdr dongle, and some custom
hardware, so it doesn't seem to be device-specific. Also, every
receiver-side hardware effect I can think of (e.g. compression or AGC)
would result in _lower_ gain for modulated signals. If anything, I would
expect to see a _lower_ RMS when turning on the modulation.
Some more details are in this blog post:
https://www.tablix.org/~avian/blog/archives/2015/04/signal_power_in_gnu_radio/
Any ideas would be welcome. At this point I have a feeling I'm missing
something obvious here...
Thanks,
Tomaž