
[Discuss-gnuradio] usrp_* gains on Mac OS X much greater than on Linux


From: Jonathan Jacky
Subject: [Discuss-gnuradio] usrp_* gains on Mac OS X much greater than on Linux
Date: Wed, 21 Dec 2005 14:26:32 -0800 (PST)


I've observed something weird running some USRP programs on Mac OS X:
the gains in effect in usrp_siggen, usrp_oscope and usrp_fft seem
unreasonably large.  I only get reasonable looking output (from
usrp_siggen) and input (with usrp_oscope and usrp_fft) with -a100 or
less, otherwise the output (or input) saturates or wraps around in a
very bizarre way.

It's as if -a100 were almost full scale when these programs run on Mac
OS X.  Running the same programs on the same USRP attached to a PC
running Linux, -a32767 is full scale.  It's as if the gain were about
200x greater on the Mac.

This is with USRP Rev 2.0 and the Standard RX and TX daughterboards,
and GNU Radio checked out from CVS on Nov. 28 2005.

One possibly important detail: I have to run the programs with high
interpolation and decimations (128 or 256) because the USRP+USB on the
Mac can only manage 4 MBytes/sec.
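For context, the rough throughput arithmetic behind that constraint (my assumptions, not stated in this post: a 64 MS/s ADC clock and complex 16-bit samples, i.e. 4 bytes per sample over USB):

```python
# Rough USB throughput estimate for the USRP at various decimation rates.
# Assumed figures: 64 MS/s ADC clock, 16-bit I + 16-bit Q = 4 bytes/sample.
ADC_RATE = 64e6          # samples/sec
BYTES_PER_SAMPLE = 4     # complex 16-bit samples

def usb_rate(decim):
    """Host-side data rate in bytes/sec for a given decimation."""
    return ADC_RATE / decim * BYTES_PER_SAMPLE

for d in (64, 128, 256):
    print(f"decim {d:3d}: {usb_rate(d) / 1e6:.1f} MB/s")
```

Under those assumptions, decimation 64 already needs the full 4 MB/s, which is why 128 or 256 is required on the Mac.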

I have only seen this effect in the Python programs under
gnuradio-examples/python/usrp, not the C programs under
usrp/host/apps.  When I run test_usrp_standard_tx (a C program) on the
Mac with these arguments:
 test_usrp_standard_tx -I 128 -a 10000
a scope (a real one, not usrp_oscope) connected to TX-A shows a 15 kHz
sine wave with a peak-to-peak amplitude of about 80 mV.  -I 128 is the
smallest interpolation that does not result in underruns on the Mac.
The relatively low amplitude shows the effect of the AC coupling at
this low frequency.  The output of this command is similar on a PC
running Linux (with GNU Radio 2.6 from the tarballs).

Next I run usrp_siggen with -f15e3 to match the output frequency of
test_usrp_standard_tx.  On a PC running Linux
  usrp_siggen -I 256 -f15e3 -a 16000 (this is the default amplitude)
results in a sine wave of about 125 mV peak-to-peak.

On the Mac, this same command does not produce a sine wave at all, but
a complicated waveform that looks amplitude-modulated.  A clean sine
wave appears only at small values for -a; for example,
 usrp_siggen -I 256 -f15e3 -a75
results in a sine wave of about 130 mV peak-to-peak - about the same
amplitude as with -a 16000 on Linux!

There is a similar gain effect on input.  Using usrp_oscope on the Mac,
with this command
 usrp_oscope -g0 -f0 -d128
applying a 20 kHz input signal of only 20 mV peak-to-peak amplitude
(from a signal generator, not from usrp_siggen) on RX-A results in a
full-scale trace on the scope display, ranging over -30000..30000.
Increasing the amplitude of the input signal results in the scope
trace wrapping around (not just clipping).  This is with -g0, the
minimum gain; the scope panel shows Gain: 0.  But on the PC running
Linux, with the same command, a 500 mV (not 20 mV) peak-to-peak input
signal causes the scope trace to range over only about -2400..2400.
So again it looks like the gain on the Mac is about 200x the gain on
Linux.
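A quick check of that arithmetic, using only the numbers quoted above (the "about 200x" is a rough figure; the two measurements actually bracket it):

```python
# Back-of-the-envelope check of the "about 200x" gain ratio, from the
# numbers quoted in this post.

# usrp_siggen: -a75 on the Mac gives roughly the same output amplitude
# as -a16000 on Linux.
siggen_ratio = 16000 / 75

# usrp_oscope: 20 mV in -> about +/-30000 counts on the Mac;
#              500 mV in -> about +/-2400 counts on Linux.
mac_counts_per_mV = 30000 / 20
linux_counts_per_mV = 2400 / 500
oscope_ratio = mac_counts_per_mV / linux_counts_per_mV

print(f"siggen ratio: {siggen_ratio:.0f}x")   # ~213x
print(f"oscope ratio: {oscope_ratio:.0f}x")   # ~312x
```

Both estimates land in the 200-300x range, so the two observations are at least consistent with each other.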

I wonder if the versions of these programs built on the Mac are
somehow setting the scale in the USRP's AD chip in some occult
fashion.

Or, I wonder if there is some interaction of interpolation/decimation and gain.
An older Linux version of usrp_oscope (checked out from CVS on Oct 7) issued
this message when invoked with -d 256:
 usrp_standard_rx::set_decim_rate: WARNING rates > 128 result in incorrect gain
The current version does not issue this message, but it suggests there
might be something tricky going on.  (But programs running on Linux at
large -i and -d do not show unusual gains).
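One speculative way that decimation and gain could interact: the USRP FPGA decimates with a CIC filter, and an N-stage CIC has a raw gain of R**N at rate R, which the FPGA must compensate, typically with a bit shift. If the available shift width runs out above some rate, the residual gain would start growing with R. The sketch below is purely illustrative - the stage count and shift width are my guesses, not taken from the USRP FPGA source - but it shows how a warning threshold at exactly 128 could arise:

```python
import math

# Hypothetical illustration of CIC decimator gain compensation.
# Assumed (NOT verified against the USRP FPGA source): a 4-stage CIC
# with raw gain R**4, compensated by a right-shift of at most
# MAX_SHIFT bits. Beyond the rate where 4*log2(R) exceeds MAX_SHIFT,
# the uncompensated (residual) gain grows with R.
N_STAGES = 4
MAX_SHIFT = 28           # hypothetical shifter width

def residual_gain(R):
    """Gain left over after the (capped) compensating bit shift."""
    raw_gain = R ** N_STAGES
    shift = min(MAX_SHIFT, math.floor(math.log2(raw_gain)))
    return raw_gain / 2 ** shift

for R in (64, 128, 256):
    print(f"decim {R:3d}: residual gain {residual_gain(R):.0f}x")
```

With these made-up parameters the residual gain is 1x up through R=128 and jumps to 16x at R=256 - the right shape for the old warning, though nowhere near the ~200x I'm seeing, so it can't be the whole story.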

Has anyone else seen this?  Can anyone suggest an explanation?

Jon Jacky
