Re: GNUstep GUI design
From: Pascal Bourguignon
Subject: Re: GNUstep GUI design
Date: Mon, 7 Oct 2002 00:55:13 +0200 (CEST)
> From: Stefan Urbanek <urbanek@host.sk>
> Date: Sun, 06 Oct 2002 20:09:36 +0100
>
> Hi all,
>
> On 2002-10-04 19:42:05 +0200 Yen-Ju Chen <yjchenx@hotmail.com> wrote:
> [snip]
>
> > And as a personal opinion, the idea of Unix-like tools works very
> > well on the command line,
> > but in a GUI environment, I doubt it.
> > Switching between applications is not as convenient as switching between tools
> > (it takes time to launch them, too many apps clutter the desktop, ...)
>
> I can imagine a system without a command line that works. I do not
> have a command line for my surrounding environment, nor a terminal
> on my desk. I see the future in something like 'interacting
> objects', and we can consider GNUstep applications as object-tools
> that we use to do our work (by manipulating objects).
>
> I am using a terminal a lot; I can say that it is my primary working
> environment. But that does not mean that it is good or comfortable; it
> means that there are no suitable GUI tools that can ease my
> work. Current GUI tools, or applications if you want, do _not
> cooperate_ together well. Why use the terminal? Mostly because
> one can chain one's work using scripts and pipes, and it is fast. GUI
> is _still_ not.
MacOS, for example. Amiga and Atari too.
NeXTSTEP/OPENSTEP/GNUstep/MacOSX have a terminal, so experienced
users can drop to a shell and write scripts when the task at hand is
repetitive, and do in 3 minutes what Mac users do in 3 hours. That
happens more often than you would think... That's the advantage of
NeXTSTEP: you have the best of GUI without losing the CLI.
Apple later tried to add scripting to its MacOS (AppleScript), and
they're still developing it in MacOSX. However, it's not really as
workable a solution as a CLI-based one. Why?
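The kind of repetitive task meant above is easily dispatched from a shell. As a minimal sketch (the directory and file names are made up for illustration): renaming a batch of .jpeg files to .jpg is one loop in a terminal, but a long session of clicking in a file manager.

```shell
# Hypothetical chore: rename every .jpeg file in a folder to .jpg.
mkdir -p /tmp/photos && cd /tmp/photos
touch a.jpeg b.jpeg c.jpeg            # stand-in files for the example

for f in *.jpeg; do
    mv "$f" "${f%.jpeg}.jpg"          # strip the old suffix, append the new
done
```

The `${f%.jpeg}` parameter expansion removes the suffix; the same loop scales unchanged from three files to three thousand.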
I too have heard this idea that everything in computer systems is an
object and that you'll manipulate these objects, sending them
messages, using specialized editors for each kind of object. In a
way, that's what you do with documents and applications, and the Macintosh
could even store any document as an application, thus making a true
object (see for example the self-extracting archives), easily with its
forks (data for the document, resources for the application code and
other resources; this has changed with the PowerPC though). Note that in
the case of the Oberon system, you could send messages to any object,
but you usually did it by writing the message statement in a text window
and executing it. Kind of a CLI to my taste... OK, Oberon is not
exactly a GUI environment. But still, why?
Another case is the Services in NeXTSTEP. Here, the data part of the
"objects" is represented by a selection in any application. This
selection has a type (text, picture, whatever), and depending on this
type, various messages, named "Services" and listed in the Services
menu, could be sent to this "object". Granted, these kinds of objects were not
very well encapsulated, since each application could be considered as
a set of methods able to act on objects whose data part is stored in
the document files (speaking on a global level, not about the
Objective-C objects that could be stored inside the files; note that
in that case you store just the data part of the instances; the method
code stays in the applications or in the libraries).
So, this may be one of the difficulties for a good object-based
GUI: the objects are split, attributes here, methods there. On the
other hand, we are considering some very big methods here (the
Netscape Communicator "method" is 22 MB!); obviously you cannot duplicate
this method in all the HTML objects (it's not done either in an object
language run-time, even if conceptually it is).
But the main problem with object systems is that you cannot take a
random object, such as one you could find on a filesystem, and start
sending it meaningful messages. Perhaps because it's hard to find
what messages it can answer (what methods it has), since up to now
these methods have been kept so far from the attributes of the object
(hidden in applications), or simply because the semantics of this
instance imposes messages that your GUI environment doesn't even know it
can send. Let's say you encounter a UISXA from the OPWQIE universe.
What do you say to him/her/it/whatever? What I mean here is that the
difficulty of the object system is that you can meaningfully relate
to another object only if you know it. That's why you need
specialized (hand-crafted) editors (the applications) to work with
each kind of object, and why you're lost when you encounter an unknown
kind of object (you can easily find and download such an object of
unknown kind from the Internet to see what I mean). If you say we can
have generic editors, I'd answer that they're not generic editors,
they're inspectors, and even if you can see and edit the attributes of an
object and see the list of methods it has, first, that does not mean
you can meaningfully interact with this object, and secondly, and
here is my second point, the fact that you, as an intelligent human,
can elaborate some semantics about these objects with this inspector
does not mean that a GUI system can do anything about it.
You'd need an AI GUI to attain this objective.
> It is more than 20 years since a graphical object environment was
> invented...and we are still using a terminal...
In conclusion, that's why the file-as-a-sequence-of-characters
paradigm, with a system of small tools working on these character
streams, blissfully ignorant of any meaning of these characters, but
which you (an intelligent human) can chain and combine easily (thanks to
redirections, pipes and scripting), is still superior to any GUI and
object system you can find.
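To make this concrete, here is a classic illustration (my own example, not from the thread): a word-frequency count built from tools none of which knows what a "word" is.

```shell
# tr splits the stream into one word per line, sort groups identical
# words, uniq -c counts each group, and the final sort -rn puts the
# most frequent first. No stage knows what the bytes "mean".
printf 'the cat sat on the mat\n' |
    tr ' ' '\n' |
    sort |
    uniq -c |
    sort -rn
```

The human supplies the semantics; the tools supply only the mechanics, which is exactly why they compose.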
> > And the interface between applications is not as good as between tools (pipes).
>
> What about drag and drop, or the pasteboard? They are quite good
> analogies of what we do every day. It is like moving things around,
> and it is the same for files and objects. The only thing that is needed (as
> I see it) is that developers have to get used to it. With ~step it
> is really easy as compared to other environments.
OK, I have another theory. (For all these discussions, you can forget
Microsoft, since they only copy the others; at least Apple, in addition
to copying Xerox, did improve and start the diffusion of this kind of UI.)
Now examine the Mac GUI (NeXTSTEP would do too). Apple (or Steve Jobs
if you want, since he was also at the head of NeXT) sells computers,
and to have its users able to use their computers, it developed one
application: the Finder (WorkspaceManager). (The rest is only
marketing tricks to involve third-party developers.) To develop this
one application, they developed their own library of GUI objects.
Note that they could have used an existing library, like Atari did,
but Apple is a vertically integrated corporation, and seeing the fate
of Atari, it's probably better they did it themselves.

So, now we have one application, and where do you draw the line
separating the application from the library? That is, the purpose of
naming some functions a library is to allow them to be used by third
parties. You want them to use these same functions to keep a common
look and feel, to the benefit of your buyers (hence of you), but you
don't want to give everything to the third parties, because they could
easily become competitors. On the contrary, we often see Apple stomp
on the markets developed by its third parties. It happens that the
line has been drawn just under the icons. Look how you have at hand
text fields, buttons, menus, windows, everything you have in the
Finder, BUT the icons. You don't have an IconManager in the MacOS
Toolbox. You don't have an NSIcon class in OPENSTEP (granted, on
NeXTSTEP/OPENSTEP it's much easier to implement an icon because
NSButton does most of the work).

The consequence of that is that there has been almost no competitor to
the Finder, and at least none has had the same ergonomic icons. Good
for Apple, but, worse for the historical evolution of GUIs and object
systems, there have been no other applications where you have objects
(represented on screen as icons) to which the user can send messages
(-[Item selectYourself], -[Item openYourself], -[Trash
takeThisItem:(Item*)i], -[Printer printThisItem:(Item*)i], etc.). Note
that Apple did not hesitate to make new classes of objects, such as
printers, or remote volumes available on the desktop, whenever it
suited them, along with new messages (they don't overdo it either).
But in terms of mindshare this is not something they've advocated
(evangelized), and no other (third-party) application ever did it.
For example, while it would be more awkward than a simple '|' between
two commands, it could be quite easy to have a "pipe" class and pipe
items that you would instantiate on the desktop; you could drag a
window (the output of an application) onto this pipe icon, and then drag
this pipe item over another application and pray that they would
process the objects going through the pipe from one application to the
other. Perhaps you would need the template icons we had on the Lisa to
represent the classes and easily instantiate new items... You could
also try Commando in MPW to see what kind of work the user would have
to do to configure a chain of applications! (Commando was a tool of
MPW that would display a GUI dialog with all the options of the other
CLI commands of MPW, thus allowing users to use MPW (and, under A/UX,
Unix) commands in GUI fashion.) That's another problem, the same one
that AppleScript has: an application has its own quirks, and is
designed with the event loop and the human user in mind, not with
scripting and batch-processing modes.
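The Unix side already has a faint analogue of such a standalone pipe object: the named pipe (FIFO), a pipe that exists in the filesystem and that independently started programs attach to, rather than a '|' typed between them. A minimal sketch (the path and data are made up for illustration):

```shell
# "Instantiate" a pipe object in the filesystem.
mkfifo /tmp/desktop-pipe

# One "application" writes into the pipe in the background...
printf 'track01\ntrack02\n' > /tmp/desktop-pipe &

# ...and another, started independently, reads from it.
tr 'a-z' 'A-Z' < /tmp/desktop-pipe    # prints TRACK01, then TRACK02

# Dispose of the pipe object.
rm /tmp/desktop-pipe
```

Like the hypothetical desktop pipe item, the FIFO decouples the two ends: neither program was written to know about the other, only about the pipe.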
On NeXTSTEP there is a graphical script editor that graphically links
(redirects/pipes) icons representing files and programs. It's harder
to use than a text-based syntax.
Now, to finish with the integration of CLI and GUI, I would say that
it's possible to develop small GUI programs that integrate well with
the CLI. This is the case with X, where you can launch most of the X
programs from the CLI with meaningful options and dataflow. Not the
latest bloatware, though. Think about xmessage, Tk, etc. See also
the NeXTSTEP copy and paste commands that gave access to the pasteboard
from the CLI and thus allowed integration of the GUI with the CLI.
> > It can be achieved via services, but that takes time,
> > especially for something like the communication between MusicBox and Encod.
> > MusicBox needs to tell Encod which track to encode,
> > the song/artist/album, the place to store the encoded song, etc.
>
> Why? Well, maybe I do not understand what a player has to do with encoding.
> (just asking)
>
> > The best situation, I think, is that MusicBox, Encod, and GSBurn
> > make all the basic functions (playlist, encoding, burning) into bundles.
> > Any application just picks up these bundles,
> > assembles them, integrates the interfaces between bundles,
> > builds the GUI, and then you have anything you want.
>
> Not a bad idea. Another very similar solution may be a framework
> providing encoding or burning mechanisms with something like
> back-end bundles for specific codecs or devices. Then the
> application does not have to know and care about particular
> bundles. The framework will provide a list of capabilities that the
> application will present to the user.
>
> [skip]
>
> Stefan
--
__Pascal_Bourguignon__ http://www.informatimago.com/
----------------------------------------------------------------------
The name is Baud,...... James Baud.