[GNUnet-developers] Namespaces / GNML
Sun, 19 Jan 2003 05:18:39 +0000
Since my previous posting I've been very silent, despite meaning to reply
to the responses you guys gave to my suggestions. The intense cold here
the other week inhibited the functioning of the email-writing lobes in
my head. Well, at least that's the excuse I'm sticking to.
Well, now with the improved weather helping my brain thaw out, I'm finding
myself able to write what I need to, in the hope it's not too late yet
(not that I've got much of great use to add). Ahem:
A few weeks back, around the same time as thinking of those "informal
protocols", I wondered about the issue of how to have a fair idea of
whether something you're going to download is likely to be something
you want (eg what the file is described as), rather than content that
you do not want, or even files of random garbage. Not only is it a
waste of time to get such bad files, but downloading them helps to
spread their blocks across the network, reducing performance.
Obviously, it isn't really possible to verify a file without
downloading it first, but I felt that if you either knew who put it up,
or knew of somebody who endorsed the file as genuine, that was probably
95% as good, as you'd get a feel for who's reliable.
My first thought was to do the obvious thing and have files inserted using
PGP signatures as insertion keywords, so that you could search for files
inserted by someone with a given public key. But doh, you'd have to
download the entire file before you could check that the signature matched
the content. You could instead have the signature sign the file descriptor,
but that would still require the nodes answering queries to
know to check signatures, which would mean new code. I was thinking of
something that was ultimately *application* layer, e.g. an augmentation
to gnunet-search and gnunet-insert. So I tried again.
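A minimal sketch of the descriptor-signing idea (all names here are invented; HMAC merely stands in for a real public-key signature like PGP's, since the point is *what* gets signed, not the algorithm): the inserter signs only the small descriptor, so a searcher can verify authorship before fetching the body, then checks the content hash after download.

```python
import hashlib
import hmac

SECRET = b"inserter-key"  # stand-in for the inserter's private signing key

def make_descriptor(name, content):
    # The small metadata record that travels with search results.
    return {
        "name": name,
        "size": len(content),
        "content_hash": hashlib.sha1(content).hexdigest(),
    }

def sign_descriptor(desc, key):
    # Canonicalise the descriptor, then sign it (HMAC as a toy signature).
    blob = repr(sorted(desc.items())).encode()
    return hmac.new(key, blob, hashlib.sha1).hexdigest()

def verify_descriptor(desc, sig, key):
    return hmac.compare_digest(sign_descriptor(desc, key), sig)

content = b"some large file body ..."
desc = make_descriptor("paper.ps", content)
sig = sign_descriptor(desc, SECRET)

# Before downloading: check only the small signed descriptor.
assert verify_descriptor(desc, sig, SECRET)
# After downloading: check the body against the descriptor's hash.
assert hashlib.sha1(content).hexdigest() == desc["content_hash"]
```

Note that the expensive full-file check happens only once, after the cheap descriptor check has already told you who vouches for the file.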
I came up with a different approach: whilst I deplore Freenet's way of
doing everything through a browser+gateway, and the fact that you
couldn't search for anything (maybe that's changed now), the idea
of having some content that was HTML linking to other Freenet content
did have a little merit.
An application-layer protocol could be set up, involving a new file
format, call it GNML (GnuNetMarkupLanguage), which would be a (heavily)
stripped down HTML-like format, where hyperlinks (and any embedded
images if used) would not be URIs but exclusively GNUnet key/hash data,
and the entire file would have a digital signature.
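To make the idea concrete, a GNML file might look something like the sketch below. This is purely illustrative: no such format exists yet, and every element name, attribute, and hash value here is invented.

```xml
<gnml>
  <title>My GNUnet page</title>
  <p>A paper worth reading:
     <a chk="C82A68D10C34AF05.A47BD4B1F09E2210">paper.ps</a></p>
  <img chk="9F03B6AF01CC2E85.11D9C0440B73A6EE" alt="logo"/>
  <sig key="0D8F3A12E577B901">3A51F08CCB2D94E6</sig>
</gnml>
```

The essential constraints are that links carry GNUnet key/hash data rather than URIs, and that the signature covers the whole document.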
Programs such as gnunet-gtk (which is still buggy, btw), would then have
a browser-like applet, tailored to rendering GNML (actually, GNML would
be designed with the browser in mind, to avoid having to code anything
gnarly) and able to check the signatures, remind the user of who the
author is (it should be possible to map public keys to long contrived names?),
ask them whether to proceed with downloading+showing any inline
images, and start downloading any links clicked on.
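The key-to-name mapping could be as simple as a local "petname" table kept by the client. A sketch (nothing like this exists in gnunet-gtk; the function names and the use of SHA-1 fingerprints are purely illustrative):

```python
import hashlib

# Local table mapping key fingerprints to names the user has assigned.
petnames = {}

def fingerprint(pubkey):
    # Shorten the public key to a manageable identifier.
    return hashlib.sha1(pubkey).hexdigest()[:16]

def remember(pubkey, name):
    petnames[fingerprint(pubkey)] = name

def who_is(pubkey):
    # What the browser applet would display next to signed content.
    return petnames.get(fingerprint(pubkey), "<unknown author>")

remember(b"alice-public-key", "Alice the Archivist")
assert who_is(b"alice-public-key") == "Alice the Archivist"
assert who_is(b"mallory-key") == "<unknown author>"
```

The point is that names are assigned locally by each user, so the "feel of who's reliable" builds up per-user rather than relying on any global registry.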
What of gnunet-search, gnunet-download, etc? Well, they could simply
save the gnml file, and allow the user to run a stand-alone gnml viewer,
which could maybe print the links onto the command line.
Other thoughts on this include the possibility of GNML files actually
being ar archives (like .deb files are), containing both the hypertext
*and* any inline images it uses. Handy?
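For the ar idea, building such a bundle is cheap because the ar format is trivial: an 8-byte magic string followed by 60-byte member headers. A sketch of the classic common format (real .deb files add member-name conventions this ignores):

```python
import io

AR_MAGIC = b"!<arch>\n"

def ar_member(name, data):
    # 60-byte header: name(16) mtime(12) uid(6) gid(6) mode(8) size(10) + "`\n"
    header = "{:<16}{:<12}{:<6}{:<6}{:<8}{:<10}".format(
        name, 0, 0, 0, "100644", len(data)).encode() + b"`\n"
    # Member data is padded to an even length with a newline.
    body = data + (b"\n" if len(data) % 2 else b"")
    return header + body

def make_bundle(members):
    out = io.BytesIO()
    out.write(AR_MAGIC)
    for name, data in members:
        out.write(ar_member(name, data))
    return out.getvalue()

bundle = make_bundle([("page.gnml", b"<gnml>...</gnml>"),
                      ("logo.png", b"\x89PNG fake image data")])
assert bundle.startswith(AR_MAGIC)
```

A viewer would then pull the hypertext and its images out of one downloaded file, instead of issuing a separate request per inline image.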
Then, not many days later I saw the namespace proposal on the website,
though I'd not heard it mentioned on the list. I'd consider the
coincidence creepy, except this sort of thing must be one of the main
TODOs for the project.
Anyway, there was the idea I'd had. I don't know whether there would be
any sense in using both schemes or not, there doesn't seem to be a
total overlap in the problem domains, so I'm throwing it into the
discussion before the discussion closes.
(Sorry it was such a long post)
- Tom Barnes-Lawrence