discuss-gnustep


From: Ivan Vučica
Subject: Re: SwiftUI compatibility APIs in GNUstep's graphics stack (Was: Which ObjC2.0 features are missing in the latest GCC?)
Date: Wed, 27 Nov 2019 19:12:17 +0000

On Wed 27 Nov 2019 at 18:58, David Chisnall <address@hidden> wrote:
On 27 Nov 2019, at 18:29, Ivan Vučica <address@hidden> wrote:
>
> I will intentionally not discuss this further, and I have
> intentionally not dug very deep. I don't want to be overly exposed to
> ideas beyond the APIs.

Apple wrote quite a lot publicly about how their fast rendering server (Quartz Extreme!!11eleventyone) worked, and it’s very similar to how most modern GUIs work.  XDAMAGE, XRENDER and XCOMPOSITE expose all of the things required to do the same on X11.  Their main innovation was that, for each font, they pre-rendered all of the glyphs and did server-side compositing. 
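
The pre-rendered-glyph idea above can be sketched roughly as follows. This is a minimal illustration, not anything from Quartz or GNUstep: `GlyphCache`, `glyph_lookup`, and the `render` callback are all hypothetical names, and the point is only that the expensive rasterisation happens once per glyph, with every later use hitting the cache.

```c
#include <stdint.h>
#include <stdlib.h>

/* Hypothetical pre-rendered glyph bitmap, kept server-side. */
typedef struct {
    uint32_t codepoint;
    int width, height;
    uint8_t *alpha;        /* 8-bit coverage mask, width * height bytes */
} Glyph;

typedef struct {
    Glyph *entries;
    size_t count, capacity;
} GlyphCache;

/* Return the cached glyph, rasterising and storing it on first use.
 * `render` stands in for the expensive font rasteriser. */
static Glyph *glyph_lookup(GlyphCache *c, uint32_t cp,
                           Glyph (*render)(uint32_t)) {
    for (size_t i = 0; i < c->count; i++)
        if (c->entries[i].codepoint == cp)
            return &c->entries[i];         /* cache hit: no rasterising */
    if (c->count == c->capacity) {
        c->capacity = c->capacity ? c->capacity * 2 : 16;
        c->entries = realloc(c->entries, c->capacity * sizeof(Glyph));
    }
    c->entries[c->count] = render(cp);     /* expensive, but only once */
    return &c->entries[c->count++];
}
```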

Oh, I’m referring to none of these obvious things. If we sit down over a beer, you might be mildly amused at why I’m not talking about it directly on an archived list; the thing I’m thinking about is not some genius idea of theirs, but it’s there.



On a modern GPU, modifying a texture is very expensive, but compositing is basically free.  On a CPU, modifying a texture is cheap and compositing is relatively expensive.  The interface that you want from the GUI stack to the display server (X11 or whatever) should favour storing rendered things on the server and should provide the regions that need redrawing.  If you’re using a GPU, rendered things can be stored as textures and composited for free.  If you need to redraw something, you can render to texture and then cache the result and composite that.

^ this
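
The model David describes could be sketched roughly like this. All of the names here are hypothetical stand-ins: `render_to_texture` and `composite` represent the display-server primitives, and `damaged` is the per-view invalidation flag the toolkit would set. The point is that re-rendering happens only for damaged views, while everything else is recomposited from its cached texture, which on a GPU is essentially free.

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical view with a server-side cached texture. */
typedef struct View {
    bool damaged;            /* set by the toolkit when content changes */
    void *cached_texture;    /* opaque server-side texture handle       */
    struct View *subviews;
    size_t nsubviews;
} View;

/* Stand-ins for display-server primitives. */
extern void *render_to_texture(View *v);  /* expensive */
extern void composite(void *texture);     /* cheap     */

static void draw(View *v) {
    if (v->damaged || v->cached_texture == NULL) {
        v->cached_texture = render_to_texture(v);  /* only on damage */
        v->damaged = false;
    }
    composite(v->cached_texture);                  /* always cheap */
    for (size_t i = 0; i < v->nsubviews; i++)
        draw(&v->subviews[i]);
}
```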

I haven’t looked at the Cairo back end, but from what I’ve seen of its behaviour, my guess as to some of the performance issues:

- It is redrawing a lot more than it needs to.  There may still be the double buffering inherited from the ART back end, but I’ve seen the entire window flicker when only a small portion should be redrawn.

- There’s no CALayer support, so redrawing a view always involves redrawing all subviews, even ones that haven’t changed. 

But it counters this by repainting only the chunks that changed instead of the whole unobscured view tree.

And repainting the whole view tree is the easiest approach once you get GPUs and compositing involved. Determining whether a view / layer is obscured can get tricky; a view under a 0.5-opacity pixel is still unobscured.

I don’t know what’s better, honestly.
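
The opacity wrinkle above can be made concrete with a small sketch. This is purely illustrative (the `Surface` struct and both functions are hypothetical, not anything from -gui or CA): a surface only obscures the view beneath it if it both covers it entirely *and* is fully opaque, so a 0.5-opacity cover still forces a repaint of what is underneath.

```c
#include <stdbool.h>
#include <stddef.h>

/* Axis-aligned rectangle plus an opacity, as a stand-in for a layer. */
typedef struct {
    int x, y, w, h;
    float opacity;    /* 1.0f == fully opaque */
} Surface;

/* Does `above` completely cover `below` geometrically? */
static bool covers(const Surface *above, const Surface *below) {
    return above->x <= below->x && above->y <= below->y &&
           above->x + above->w >= below->x + below->w &&
           above->y + above->h >= below->y + below->h;
}

/* `below` is obscured only if some surface above it both covers it
 * entirely AND is fully opaque; a translucent cover lets the view
 * underneath show through, so it still has to be repainted. */
static bool is_obscured(const Surface *below,
                        const Surface *above, size_t nabove) {
    for (size_t i = 0; i < nabove; i++)
        if (above[i].opacity >= 1.0f && covers(&above[i], below))
            return true;
    return false;
}
```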


Pretty much all other modern toolkits have some CALayer equivalent and so can cache expensive-to-render content and just recomposite it.  This is particularly noticeable over remote X11, where those images are cached on the display server and only the redrawn bits need to be transferred over the network.

CAAppKitBridge should be our way out of this; the performance issues will then move into Opal, the Opal back end, and our CA implementation. This needs to be finished before we can see whether it’s worth it. At that point, making the whole view tree layer-backed involves NSWindow telling its contentView that it wantsLayer. [Well, an internal equivalent, so we don’t confuse apps that may try to read the property.] We can easily have a user-defaults knob to trigger this.

-back and -gui can *mostly* be unaware of this, which is probably for the best.
--
Sent from Gmail Mobile
