Re: Backend <-> GUI Library Interaction
From: Richard Frith-Macdonald
Subject: Re: Backend <-> GUI Library Interaction
Date: Tue, 7 Nov 2006 07:01:48 +0000
On 7 Nov 2006, at 00:55, Christopher Armstrong wrote:
Hi
On Mon, 6 Nov 2006 12:16:49 +0000, "Richard Frith-Macdonald"
<richard@tiptree.demon.co.uk> said:
Neither ... it should set its level to be 50 and then place A
immediately above B.
That's interesting; I thought the only way a window could change
window levels was via the setwindowlevel: message.
I thought the same until I checked. The documentation suggested that
behavior but was not completely explicit, so I looked at the X
backend code to confirm it.
The other interpretation (that ordering of windows relative to each
other only worked for windows in the same level) seems a bit more
intuitive to me.
It might be worth checking what MacOS-X actually does, and
considering a change to the GNUstep behavior if it does it the other way.
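To make that concrete, here is a rough sketch of the ordering behaviour
described above (windowA and windowB are just illustrative names, and the
explicit setLevel: call simply spells out what ordering relative to B
would do implicitly):

  /* Place window A immediately above window B.  Per the X backend
     behaviour described above, ordering relative to B also adopts
     B's level, so the setLevel: call here is effectively implied. */
  [windowA setLevel: [windowB level]];
  [windowA orderWindow: NSWindowAbove
            relativeTo: [windowB windowNumber]];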
In the AppKit, there are two notifications,
NSApplicationDidBecomeActiveNotification and its counterpart
NSApplicationWillBecomeActiveNotification. Are these supposed to be
sent through an application when one of its windows is made
"active" (i.e. becomes key)? Or is it when an application begins
responding to event messages?
They should be sent when the app becomes active/inactive ... ie at the
point when it displays/hides its menus and panels.
So, if an app is inactive and you click on any of its windows, making
that window key, you will also be making the app active and the
notification will be sent; but if the app is already active and you
click on a different window, changing it to be key, then the
notification is not sent.
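As a minimal sketch of that distinction (the selector name here is
purely illustrative), an app interested in activation rather than
key-window changes would observe something like:

  /* Fires when the whole app becomes active, not on every key-window
     change within an already-active app. */
  [[NSNotificationCenter defaultCenter]
      addObserver: self
         selector: @selector(appDidBecomeActive:)
             name: NSApplicationDidBecomeActiveNotification
           object: NSApp];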
I note there appear to be no private AppKit events defined in NSEvent.h
for these, nor could I find any instance in gnustep-gui where they are
actually sent. Are they currently being used? I would imagine that they
would be used by an application when a user cycles between apps running
on a system, e.g. I was using the Macs at uni the other day, and I
opened up Terminal with a main window (a terminal) and a font panel.
When I switched to another application, the font panel disappeared.
When I switched back, the font panel reappeared. I'll have to check the
behaviour of some of these apps on Linux.
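(As an aside, that disappearing/reappearing font panel is the standard
hides-on-deactivate behaviour for panels; a minimal sketch of the call
involved, assuming fontPanel is the panel in question:

  /* NSPanel hides on app deactivation by default; any window can
     opt in or out explicitly. */
  [fontPanel setHidesOnDeactivate: YES];
)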
There is GSAppKitWindowFocusIn to tell the gui when a window is given
focus. This is really a hack to cope with X ... since what the gui
really wants to see is a mouse click in the window.
The notifications are sent by NSApplication ... the backend doesn't
need to do that ... it could call the NSApplication
activateIgnoringOtherApps: method to activate the app, but activation
is probably done implicitly (ie the backend tells the gui that a
window has been given focus, and the gui, noticing that it is inactive
but has just had a window become key, changes to become active).
When focus leaves a window without being given to another window of
the app first, the backend tells the application to deactivate itself.
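For illustration only, a rough sketch of how a backend might forward
such a focus-in to the gui, assuming the GSAppKitWindowFocusIn subtype
from GNUstep's NSEvent.h and a valid window number in win (the exact
posting path will differ in the real backends):

  /* Wrap the focus-in as an AppKit-defined event and hand it to
     the gui's event queue via the display server. */
  NSEvent *e = [NSEvent otherEventWithType: NSAppKitDefined
                                  location: NSZeroPoint
                             modifierFlags: 0
                                 timestamp: 0
                              windowNumber: win
                                   context: [NSApp context]
                                   subtype: GSAppKitWindowFocusIn
                                     data1: 0
                                     data2: 0];
  [GSCurrentServer() postEvent: e atStart: NO];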
Should I add some events to gnustep-gui which notify the frontend of
these changes, or should I just post the notifications myself from the
backend?
You don't need to do anything ... the NSApplication will post the
notifications.
Win32 provides some window messages such as WM_ACTIVATE (a window
in an
application was given/lost the keyboard focus), WM_ACTIVATEAPP (your
application itself lost/gained the keyboard focus) and
WM_WINDOWPOSCHANGING/WM_WINDOWPOSCHANGED (called for a litany of
window
resize/move/focus/Z-order events where the outcome can be
"adjusted"). I
think these could help.
<snip>
It sounds like WM_ACTIVATEAPP should trigger app activation and
WM_ACTIVATE should tell the app to change the key window and the
others should do window movement ... I would have thought that the
win32 backend was already using these in some way ... I guess not
optimally though.
Actually, not really at all. They are stubbed out though for "future
implementations". Only some more basic events appear to be handled.
Perhaps they are not needed because other events provide the same
information implicitly? For instance, WM_ACTIVATEAPP would be
equivalent to a focus-in event for the key window.
However, I would have thought that WM_ACTIVATE was necessary ...
unless Windows always sends a mouse click event when focus goes to a
window. If it always sends a mouse click then nothing else should be
necessary, as the receipt of the mouse click should let the gui make
the window key.
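As a hedged sketch (this is not the existing WIN32Server code, and the
mapping shown is just my reading of the suggestion above; the function
name is illustrative), the messages might be handled roughly like this:

  #include <windows.h>
  #import <AppKit/AppKit.h>

  /* Illustrative window procedure fragment only. */
  LRESULT CALLBACK exampleWndProc(HWND hwnd, UINT msg,
                                  WPARAM wParam, LPARAM lParam)
  {
    switch (msg)
      {
        case WM_ACTIVATEAPP:
          /* wParam is non-zero when our application gains focus;
             activate or deactivate the whole app accordingly. */
          if (wParam)
            [NSApp activateIgnoringOtherApps: YES];
          else
            [NSApp deactivate];
          return 0;

        case WM_ACTIVATE:
          /* LOWORD(wParam) != WA_INACTIVE means this window gained
             focus; forward a focus-in so the gui can make it key. */
          break;
      }
    return DefWindowProcW(hwnd, msg, wParam, lParam);
  }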
There was another case I also forgot to mention. I believe NSPanel can
be subclassed and its windows prevented from becoming "key" and
becoming "main". How does gnustep-gui expect the backend to handle this?
I'm presuming it just resets the keyboard input focus if such a window
tries to become key, or does it expect the backend to prevent this
situation?
IIRC it is done in the gui at present ... if focus is given to something
which shouldn't have it, the gui resets focus to the key window.
I guess the backend could do the same thing without ever bothering
the gui layer ... but that would require some duplication of code to
check whether the window given focus can accept it, so it's probably
best left to the gui to handle it.
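For reference, the kind of subclass being described is simply one that
refuses key/main status, along these lines (the class name is just
illustrative):

  /* A panel that never becomes key or main; the gui keeps focus on
     the existing key window when such a panel is given focus. */
  @interface InfoPanel : NSPanel
  @end

  @implementation InfoPanel
  - (BOOL) canBecomeKeyWindow  { return NO; }
  - (BOOL) canBecomeMainWindow { return NO; }
  @end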