Date: Tue, 22 Jun 2010 15:44:05 +0200
I'm new to this world, but I'm trying to understand the program usrp_spectrum_sense.py.
I want to know how many samples my USRP captures in a given time period.
Theoretically it should capture USRP_RATE x n samples in n seconds:

in 10 seconds  => 4 MS/s x 10 s  = 40M samples
in 60 seconds  => 4 MS/s x 60 s  = 240M samples
in 120 seconds => 4 MS/s x 120 s = 480M samples
I use the default values.
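The expected counts above are just rate times time; a minimal sketch of that arithmetic (assuming the 4 MS/s rate quoted above):

```python
# Expected sample counts at the nominal rate from the post.
# USRP_RATE = 4 MS/s is the value stated above, not read from hardware.
USRP_RATE = 4e6  # samples per second

for seconds in (10, 60, 120):
    expected = USRP_RATE * seconds
    print(f"{seconds:>4} s -> {expected:.0f} samples")
# -> 10 s gives 40,000,000 samples; 120 s gives 480,000,000 samples
```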
But when I launch the program with an appropriate counter added in the usrp_source_base.cc file, I check the number of samples captured, and in nine different experiments they were:
in 10 seconds  => 7.18193e7 samples
in 60 seconds  => 3.60591e8 samples
in 120 seconds => 1.25783e9 samples

in 10 seconds  => 8.82063e7 samples
in 60 seconds  => 3.4864e8 samples
in 120 seconds => 1.22311e9 samples

in 10 seconds  => 5.97432e7 samples
in 60 seconds  => 4.15018e8 samples
in 120 seconds => 7.2589e8 samples
How is it possible that these values don't correspond to the theoretical ones? And why does the real USRP_RATE differ?