
[Discuss-gnuradio] Data lost when using big file sources


From: Bogdan Diaconescu
Subject: [Discuss-gnuradio] Data lost when using big file sources
Date: Tue, 10 Apr 2012 03:48:11 -0700 (PDT)

Hello gnuradio fellows,

I have an issue that appears in every GNU Radio version I have used lately (I 
started with 3.3, and last week I updated to the latest from git), and I 
thought I should post here before allocating the time to look into it myself.

I'm modifying a GNU Radio block that is connected from Python to 6 file 
sources. Everything works fine as long as the source files are relatively 
small (around 30 MB). When the files become larger, the data received in 
general_work() is corrupted: not massively, but enough to ruin my results.

To investigate the problem, I ran a small test: I filled the 6 files with 
known patterns and printed an error from general_work() whenever the received 
data differed from the expected pattern. The result is that with files over 
100 MB I see 50-80 errors in total.
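For reference, the pattern check described above can be sketched in plain Python (this is a hypothetical stand-alone harness, not the poster's actual code or a GNU Radio block): write a file filled with a known repeating byte pattern, then re-read it in chunks and count mismatches, the same way one might verify what general_work() receives.

```python
# Hypothetical sketch of the known-pattern test: generate a pattern file,
# then re-read it in chunks and count bytes that deviate from the pattern.
import os
import tempfile

PATTERN = bytes(range(256))  # known repeating 256-byte pattern

def write_pattern_file(path, size):
    """Fill `path` with `size` bytes of the repeating pattern."""
    with open(path, "wb") as f:
        written = 0
        while written < size:
            chunk = PATTERN[: min(len(PATTERN), size - written)]
            f.write(chunk)
            written += len(chunk)

def count_errors(path, chunk_size=4096):
    """Re-read the file chunk by chunk and count mismatched bytes."""
    errors = 0
    offset = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            for i, b in enumerate(chunk):
                if b != PATTERN[(offset + i) % len(PATTERN)]:
                    errors += 1
            offset += len(chunk)
    return errors

path = os.path.join(tempfile.mkdtemp(), "pattern.dat")
write_pattern_file(path, 1 << 20)  # 1 MB for a quick check; the report used >100 MB
print(count_errors(path))  # a plain file read reports 0 errors
```

A plain read like this should always report zero errors, which is what makes the 50-80 mismatches seen inside general_work() point at the path between the file sources and the block rather than at the files themselves.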

Now, to unblock my work, I observed that if I insert some printing in 
general_work() the errors disappear. Going further, inserting a 100 µs delay 
(via boost) also makes the problem go away.

Some more data: 
1. I know new blocks should use work() instead of general_work(), but 
general_work() is still supposed to work as long as I call consume_each(), 
right?
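The contract in question can be illustrated with a toy model (this is a simplified stand-in, not GNU Radio's real scheduler): the scheduler repeatedly offers a window of input items to the block, and anything the block does not consume is offered again on the next call. A general_work()-style block that calls consume_each() correctly should therefore see every item exactly once, regardless of window size.

```python
# Toy model (not GNU Radio's actual scheduler) of the general_work() /
# consume_each() contract: items a block does not consume are re-offered
# on the next call; consumed items are advanced past.
from collections import deque

class ToyScheduler:
    def __init__(self, data):
        self.buffer = deque(data)

    def run(self, general_work, max_items=4):
        """Repeatedly offer up to `max_items` until everything is consumed."""
        out = []
        while self.buffer:
            window = list(self.buffer)[:max_items]
            produced, consumed = general_work(window)
            out.extend(produced)
            for _ in range(consumed):
                self.buffer.popleft()  # advance past consumed items only
        return out

def passthrough(window):
    # A general_work-style block: produce everything it sees, then report
    # that it consumed the whole window (the consume_each(len(window)) case).
    return list(window), len(window)

sched = ToyScheduler(range(10))
print(sched.run(passthrough))  # each of 0..9 is delivered exactly once
```

Under this model, a correct scheduler plus a correct consume_each() call cannot duplicate or drop items, which is why corruption that appears only with large files and disappears with a small delay smells like a buffer or timing issue rather than a misuse of the API.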

2. I'm using a Core i7 2900K, if that matters.

This error has haunted me for a long time, but now that I have finished my 
work I'm thinking of hunting for it (is it in the scheduler, or in the way 
data is taken from the file sources?) and maybe fixing it.

Thanks,
Bogdan






