Re: [Gnash-dev] OpenGL renderer broken on FreeBSD

From: Dmitry Marakasov
Subject: Re: [Gnash-dev] OpenGL renderer broken on FreeBSD
Date: Thu, 22 May 2008 06:38:08 +0400
User-agent: Mutt/1.5.17 (2007-11-01)

* Bastiaan Jacques (address@hidden) wrote:

> Given that the performance with nv, as you noted, is much better than
> with your card, I would strongly suspect your issue is a bug in the
> nvidia driver, and perhaps in the X server also. (I am aware that other 
> people with proper hardware acceleration have been using this code for 
> quite a while.)
That is not impossible, but I'd really like to get to the root cause of
this problem.
The problem definitely happens on the X server side: I'm now running gnash
on another machine with X forwarded to my local display. There's no CPU
load on the gnash side, but the X server sits at 100% CPU, and there's no
significant network traffic either.
This hang happens after drawing:

for (int i = 0; i < numPoints; ++i) {
    // draw scene
}

glAccum(GL_RETURN, 1.0);

if (_video_frame._frame.get()) {
    // this is never called
    reallyDrawVideoFrame(_video_frame._frame.get(), &_video_frame._mat, ...);
}

// now we'll hang here in glGetError()

Seems like it'll hang on any GL function: if I comment the glGetError call
out, it freezes somewhere else. It also freezes for almost a whole
number of seconds (sometimes 7, sometimes 8).

Maybe this will give a hint to somebody aware of X internals, while I
play with the code to either fix this or revert to non-antialiased mode.

> That said, the presence and type of anti-aliasing may become
> configurable in future versions of the OpenGL renderer.
That's good to hear; it'll be useful for performance reasons anyway. The
best thing would be being able to toggle it at runtime.

Dmitry A. Marakasov    | jabber: address@hidden
address@hidden       |
