
Re: [Discuss-gnuradio] Updates to gr-qtgui


From: Josh Blum
Subject: Re: [Discuss-gnuradio] Updates to gr-qtgui
Date: Wed, 13 Apr 2011 14:33:38 -0700
User-agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.14) Gecko/20110223 Lightning/1.0b2 Thunderbird/3.1.8

> If you find that it's not working for you or still producing segmentation
> faults on close, please let me know (and let me know your OS, CPU, and any
> other relevant features you can think of). I have run it on a few machines
> and various VMs, but it's a limited set. You can try the
> gr-qtgui/apps/pyqt_example_c.py as a test.
> 

So I put together a simple flow graph in GRC: noise -> throttle -> QT
sink. I ran it repeatedly and closed it with Alt+F4 each time. Here are
the results (Python app attached). Ubuntu 10.10 x64, latest gnuradio
master (e8ff9ef4bb77517428e1208ff4b3551a38107bbd).
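
For reference, the flow graph amounts to roughly the following. This is
a minimal sketch only, not the attached top_block.py; the block names
and the qtgui.sink_c parameters assume the GRC-generated style of this
era:

    #!/usr/bin/env python
    # Sketch of the test case: noise -> throttle -> qtgui sink.
    # Assumptions: gnuradio-core block names and the qtgui.sink_c
    # signature of this era.
    import sys
    import sip
    from gnuradio import gr, qtgui
    from gnuradio.gr import firdes
    from PyQt4 import Qt

    class test_top_block(gr.top_block):
        def __init__(self):
            gr.top_block.__init__(self)
            samp_rate = 32000
            src = gr.noise_source_c(gr.GR_GAUSSIAN, 1, 0)
            thr = gr.throttle(gr.sizeof_gr_complex, samp_rate)
            self.snk = qtgui.sink_c(1024, firdes.WIN_BLACKMAN_hARRIS,
                                    0, samp_rate, "QT GUI Plot",
                                    True, True, True, True)
            self.connect(src, thr, self.snk)

    if __name__ == '__main__':
        qapp = Qt.QApplication(sys.argv)
        tb = test_top_block()
        tb.start()
        # Show the sink's widget as a top-level window; closing it
        # (e.g. Alt+F4) makes qapp.exec_() return.
        win = sip.wrapinstance(tb.snk.pyqwidget(), Qt.QWidget)
        win.show()
        qapp.exec_()
        tb.stop()  # the reported crash happens somewhere in teardown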

address@hidden:~/work/grc$ ./top_block.py
Segmentation fault
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
Segmentation fault
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
Segmentation fault
address@hidden:~/work/grc$ ./top_block.py
Segmentation fault
address@hidden:~/work/grc$ ./top_block.py
Segmentation fault
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
Segmentation fault
address@hidden:~/work/grc$ ./top_block.py
Segmentation fault
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$ ./top_block.py
Segmentation fault
address@hidden:~/work/grc$ ./top_block.py
address@hidden:~/work/grc$

So it prints "Segmentation fault" on exit for many of those runs. I had
also tried this test with a UHD block whose destructor had a print in
it. What I observed was that the destructor with the print was not
called most of the time, even when the run didn't print a segfault. So
my guess is that many of those seemingly successful exits are not
actually clean.

Can you run my app 20 times in succession and see whether it segfaults?
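
In case it helps, here's a hypothetical little runner (not part of the
attachment) that relaunches the app and counts the runs killed by
SIGSEGV; the window still has to be closed by hand each time:

    #!/usr/bin/env python
    # Launch top_block.py repeatedly and count segfaulting runs.
    # On Linux, subprocess reports a child killed by signal N as
    # returncode -N, so SIGSEGV shows up as -11.
    import subprocess

    RUNS = 20
    segfaults = 0
    for i in range(RUNS):
        rc = subprocess.call(['./top_block.py'])  # close window by hand
        if rc == -11:
            segfaults += 1
    print '%d of %d runs segfaulted' % (segfaults, RUNS)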

Thanks,
-Josh

Attachment: top_block.py
Description: Text Data

