[bug#44199] [PATCH 0/1] An origin method for GNUnet FS URI's

From: Maxime Devos
Subject: [bug#44199] [PATCH 0/1] An origin method for GNUnet FS URI's
Date: Tue, 27 Oct 2020 19:50:18 +0100
User-agent: Evolution 3.34.2

zimoun wrote on Tue 27-10-2020 at 14:39 [+0100]:
> Dear,
> Thank you for the patch.  My questions are totally naive since I do
> not
> know much about GNUnet.
Conceptually, the GNUnet file-sharing system is not unlike IPFS, dat,
or even torrents, although the details can vary wildly. I don't know
much more about GNUnet either except for how to install it, and how to
publish and search for files.

> On Sat, 24 Oct 2020 at 21:47, Maxime Devos <> wrote:
> > This patch defines a `gnunet-fetch' method, allowing for
> > downloading
> > files from GNUnet by their GNUnet chk-URI.
> > 
> > This patch does not provide:
> > - a service configuration
> > - downloading substitutes from GNUnet
> > - fall-back to non-P2P (e.g. http://) or other P2P (e.g. ipfs://)
> >   systems
> > - downloading directories over GNUnet
> This means it only works for archives as tarball, right?

More or less, yes. I would assume a zip archive would work just as
well, and in the rare case where the source code consists of a single
file (e.g. search for %boot-logo-patch in (gnu packages linux)),
you don't need an archive at all.

GNUnet has a special format for directories, presumably supporting
deduplication (I haven't checked), conceptually not unlike ipfs and
dat. It is probably not too much work to support this format, but I
would like to keep the first patch simple.

> > - Would it be possible somehow for url-fetch to support
> > gnunet://fs/chk
> >   URIs? That way we could fall-back unto non-P2P URLs, which would
> > be
> >   useful to bootstrap a P2P distribution from a non-P2P system.
> Who is the “we”?  What do you mean by “url-fetch supports gnunet://
> and
> fall-back unto non-P2P”?

Presuming this patch is accepted in some form, ‘we’ refers to code in
Guix responsible for downloading archives. I should have formulated
that better. About the ‘fall-back unto non-P2P’:

The ultimate goal is to not rely on any centralised servers at all (not
even mirrors; these are the non-P2P systems), and to download all
archives and substitutes from GNUnet FS (or IPFS or dat, but I prefer
GNUnet). Of course, expecting everyone to immediately stop using
HTTP / FTP / etc. for source distribution and use GNUnet instead is
foolish, so ideally there would be a smooth transition path:

1. At first, maintainers would typically still publish the tarballs(*)
   on a centralised (possibly mirrored) servers, say
   <>. The package definition would use the
   url-fetch method.

2. If $MAINTAINER doesn't have access to or doesn't want to use any
   publicly-available distribution site (e.g. due to censorship, or for
   pseudonymity or simplicity), $MAINTAINER may instead use the
   gnunet-fetch method in the package definition.

   Alternatively, if the distribution site disappeared, Guix
   maintainers have the option to point to GNUnet instead. (Or to
   refer to swh, I presume, but my goal is to decentralise.)

3. $MAINTAINER finds GNUnet convenient. However, GNUnet hasn't achieved
   world domination yet, so $MAINTAINER still publishes tarballs on
   <> or similar.  Actually, even if GNUnet *were*
   universally supported, a centralised server can still be useful:

   P2P file-sharing systems by design may have to throw away old data
   the local peer isn't interested in (due to limited disk space), and
   $MAINTAINER might stop publishing the source code over GNUnet.

   In such cases, a centralised server (and some mirrors) may be a
   useful back-up, although it would still be preferred to use a
   distributed P2P system (such as GNUnet or ipfs) when available
   to limit the cost of running the centralised server (and mirrors).

   (To limit the utilisation of the centralised server (and mirrors),
   sources downloaded from the server (and mirrors) should be published
   back into the GNUnet FS, but this can be as simple and crude as a
   cron job `gnunet-publish /gnu/store/*.tar.gz`.)

   My idea in this case is to specify the origin as:

       (method url-fetch)
       (uri (list "gnunet://fs/chk/etcetera"
                  (string-append "mirror://gnu/hello-"
                                 version ".tar.gz")))

   However, this patch defines gnunet-fetch as a separate method
   from url-fetch; url-fetch is left untouched, so it will simply
   ignore the gnunet://fs/chk URI's.

   If I understand correctly, url-fetch is a special method, in that
   the downloading code of the Guix daemon is used, and not the code
   defined by the user. Nothing a `guix system reconfigure` cannot fix,
   but nevertheless rather inconvenient for testing.
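
   To make the fallback idea above concrete, the whole origin might
   look something like the sketch below. This is hypothetical: the
   chk-URI and the hash are made-up placeholders, and it assumes
   url-fetch were taught to skip URI schemes it does not understand.

   ```scheme
   (origin
     (method url-fetch)
     ;; Try the P2P source first, then fall back to the centralised
     ;; mirror.  The gnunet://fs/chk URI below is a placeholder.
     (uri (list "gnunet://fs/chk/etcetera"
                (string-append "mirror://gnu/hello-"
                               version ".tar.gz")))
     (sha256
      (base32 "...")))
   ```

   With an origin like this, a user with a running GNUnet daemon
   would never touch the mirror, while everyone else would silently
   fall back to the ordinary download path.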

> Some recent discussions are about content-address and fallback.  For
> example, roughly speaking ’git-fetch’ tries upstream, then the Guix
> build farms, then Software Heritage (swh).  For Git repo, it works
> because the address from Guix side to SWH is straightforward.  The 2
> other VCS –hg and svn– supported by SWH should be implemented soon…
> who
> knows! ;-)

/me missed these discussions

> The story about archives as tarball is a bit more complicated.  The
> main
> issue –as I understand it– can be summarized as: Guix knows the URL,
> the
> integrity checksum and only at package time the content of the
> tarball.
> Later in time, it is difficult to lookup because of this very
> address;
> and some are around: nar, swh-id, ipfs, gnunet, etc.
> Bridges to reassemble the content are currently discussed, e.g.,
>    <>
>    <>
> Well, today the fallback of tarball archive to SWH is not reliable.
> What is your question? ;-)

The answer to which question? I'm not quite sure which one you're
referring to, so some guesses:

Q: How does Guix figure out the GNUnet URI from the Guix (or nar, I
   guess) hash?
A: Not automatically.  The gnunet-fetch method as defined in this
   patch needs to be passed the URI manually. However, an additional
   service for GNUnet can be written that uses the DHT to map Guix
   (or nar, or something else) hashes to corresponding GNUnet URI's.

   (I'm not volunteering (yet))

Q: What about automatically generated tarballs (e.g. from git
   repositories)?
A: Not addressed by this patch. The intention is to be able to replace
   a http://*/*.tar.gz URL with a gnunet://fs/chk URI in package
   definitions; source code repositories aren't supported by this
   patch. (But perhaps a future patch could support this!)

Q: Integration with bridges?
A: Sounds nice, but I'm trying to keep things simple for the first
   patch! (And I haven't heard of disarchive before now.)

> > Then publish the source tarball of the package to the GNUnet FS
> > system:
> > $ guix environment --ad-hoc wget -- wget 
> >
> > $ gnunet-publish hello-2.10.tar.gz
> Naive question:  are packages only available on GNUnet?

Naive interpretations of this question, and answers:

* Are *packages* the only thing available on GNUnet?

  While experimenting with GNUnet, I have seen some images published on
  GNUnet. GNUnet also has other systems than the FS system, but they
  aren't relevant to this patch.

* Are package *definitions* only available on *GNUnet*?

  I'm getting my Guix package definitions from the git repository
  on Savannah, and I have never encountered any on the GNUnet
  FS system.

* Is package *source code* only available on *GNUnet*?

  If someone published the source code (e.g. as a tarball) on GNUnet
  with `gnunet-publish hello-2.10.tar.gz`, it is only published(*) on
  GNUnet, and not somewhere else as well.

  (*) I don't know exactly when one can be reasonably sure the file
  will *remain* available for some time when published, except for
  keeping the GNUnet daemon running continuously.

  However, in practice, $MAINTAINER will publish the source code
  somewhere else as well (e.g. <> or perhaps ipfs).
  This patch doesn't automatically publish source code of built or
  downloaded packages on GNUnet, although that seems a useful service 
  to run as a daemon.

* If the gnunet-fetch method is used, will Guix try to get the source
  code from GNUnet, and nowhere else?

  Ignoring substitutes, yes. This is a limitation of defining
  gnunet-fetch separately from url-fetch. I believe this has been
  addressed earlier in my response e-mail.

> All the best,
> simon

Likewise, maxime

