From: Brian Gough
Subject: Re: [bug-gsrc] /srv/bzr/gsrc/trunk r764: revert to make-3.81 due to backwards incompatibility of 3.82
Date: Sun, 23 Oct 2011 20:48:15 +0100
User-agent: Wanderlust/2.15.6 (Almost Unreal) Emacs/23.3 Mule/6.0 (HANACHIRUSATO)

Thanks for the explanation. Do you have an account on Savannah? I can
add you to the project so you can commit directly.

At Mon, 17 Oct 2011 19:36:56 -0400,
Luke Shumaker wrote:
> 
> [1  <text/plain; US-ASCII (7bit)>]
> At Mon, 17 Oct 2011 14:30:21 +0100,
> Brian Gough wrote:
> > At Sun, 16 Oct 2011 23:49:27 -0400,
> > Luke Shumaker wrote:
> > > The attached patch should make GSRC work just fine with make 3.82.
> > > 
> > > Now, instead of using
> > >  $ make http//example.com/foo.tar.gz
> > > it's
> > >  $ make download/http/example.com/foo.tar.gz
> > > 
> > > While it's a little more... verbose, I like it because it seems
> > > slightly less "magic"; protocols are foreign to make, directories aren't.
> > 
> > Hi Luke. Can you explain how this works?  I thought the problem was
> > that if a filename like 'patch-3.18.tar.gz' matched a target like
> > 'patch-%' it triggered the wrong rule, which seemed impossible to work
> > around.  I've tried "make -C gnu/patch" and I get an infinite loop
> > using make-3.81.
> > 
> > address@hidden:/home/gnu/gsrc$ make -C gnu/patch
> > make: Entering directory `/home/gnu/gsrc/gnu/patch'
> >  ==> Grabbing download/patch-2.6.1.tar.gz
> >      ==> Trying download/file/files/patch-2.6.1.tar.gz
> > <snip>
> 
> Make 3.82 will select the rule with the shortest stem (the part
> matched by the %); in other words, the most specific rule, the one
> with the longest literal part. (Directory snipping makes the latter
> definition easier to explain.)
> 
> "patch-%" matches 6 non-stem characters, but "http/%" only matches 5,
> so make selects the former. To work around this, all we have to do is
> make the latter longer; for example by prefixing it with "download/".
> "download/http/%" will match 14 non-stem characters, so it will be
> selected.
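> 
> To make that concrete, here's a minimal sketch of the selection, with
> made-up rules and a made-up URL rather than the real gar.lib.mk ones:
> 
> download/%:		# the generic rule, defined first
> 	@echo "picked download/%      (stem: $*)"
> 
> download/http/%:	# the more specific rule
> 	@echo "picked download/http/% (stem: $*)"
> 
> With make 3.82, "make download/http/example.com/foo.tar.gz" runs the
> second rule, because "download/http/%" leaves the shorter stem
> ("example.com/foo.tar.gz" rather than "http/example.com/foo.tar.gz");
> make 3.81 would simply run whichever of the two is defined first.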
> 
> This method is great because it makes the downloads look more like
> ordinary files, which is what make is built to deal with. Ideally,
> we'd integrate it a little better with the garchive system used by
> GAR, but that patch is a little less trivial.
> 
> Now, for the prefix to tack onto the URLs, I just chose
> $(DOWNLOADDIR), which creates a collision between the URL targets and
> the "$(DOWNLOADDIR)/%" rule, the one that tries a number of URLs in
> turn to fetch the file.
> 
> This is fine in make 3.82: "$(DOWNLOADDIR)/http/%" is longer than
> "$(DOWNLOADDIR)/%", so make does the right thing.
> However, I forgot that make 3.81 just selects the first matching
> pattern, "$(DOWNLOADDIR)/%", which throws it into an infinite loop
> (I'm actually surprised that make doesn't detect the loop, but my
> approach to learning make's safety mechanisms is that if you need
> them at all, you're doing it wrong).
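> 
> A stripped-down sketch of that collision (again made-up rules, not the
> real gar.lib.mk ones):
> 
> download/%:		# defined first: make 3.81 always picks this
> 	@echo " ==> Grabbing $@"
> 	$(MAKE) download/http/example.com/$(notdir $@)
> 
> download/http/%:	# make 3.82 picks this for download/http/... names
> 	@echo "fetching http://$*"
> 
> With make 3.82, "make download/foo.tar.gz" recurses once, the inner
> target "download/http/example.com/foo.tar.gz" matches the
> shortest-stem rule "download/http/%", and it stops. With make 3.81
> the inner target matches the first-defined "download/%" again, so it
> keeps recursing; that's the endless "==> Grabbing" loop Brian ran
> into.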
> 
> Anyway, here's another patch (a replacement for the old one, not on
> top of it) that uses "cache/" instead of "download/".
> 
> ~ Luke Shumaker
> 
> [2 make382-fix-take2.patch <text/plain; US-ASCII (base64)>]
> === modified file 'gar.lib.mk'
> --- gar.lib.mk        2011-08-08 16:47:44 +0000
> +++ gar.lib.mk        2011-10-17 19:50:27 +0000
> @@ -52,11 +52,12 @@
>  SIGFILE_URLS = $(foreach DIR,$(FILE_SITES) $(SIGFILE_DIRS) $(MASTER_DIRS),$(addprefix $(DIR),$(SIGFILES)))
>  PATCHFILE_URLS = $(foreach DIR,$(FILE_SITES) $(PATCHFILE_DIRS) $(MASTER_DIRS),$(addprefix $(DIR),$(PATCHFILES)))
>  
> +ALL_URLS = $(DISTFILE_URLS) $(SIGFILE_URLS) $(PATCHFILE_URLS)
>  # FIXME: doesn't handle colons that are actually in the URL.
>  # Need to do some URI-encoding before we change the http:// to
>  # http// etc.
> -URLS = $(subst ://,//,$(DISTFILE_URLS) $(SIGFILE_URLS) $(PATCHFILE_URLS))
> -
> +URL_FILES = $(subst ://,/,$(ALL_URLS)) # This is where any encoding happens
> +URLS = $(addprefix $(CACHEDIR)/,$(URL_FILES))
>  
>  # Download the file if and only if it doesn't have a preexisting
>  # checksum file.  Loop through available URLs and stop when you
> @@ -84,23 +85,23 @@
>  WGET_OPTS = -c --no-check-certificate --passive-ftp -U "GSRC/1.0"
>  
>  # download an http URL (colons omitted)
> -http//%: 
> +$(CACHEDIR)/http/%:
>       wget $(WGET_OPTS) -O $(DOWNLOADDIR)/$(notdir $*).partial http://$*
>       mv $(DOWNLOADDIR)/$(notdir $*).partial $(DOWNLOADDIR)/$(notdir $*)
>  
>  # download an https URL (colons omitted)
> -https//%: 
> +$(CACHEDIR)/https/%:
>       wget $(WGET_OPTS) -O $(DOWNLOADDIR)/$(notdir $*).partial https://$*
>       mv $(DOWNLOADDIR)/$(notdir $*).partial $(DOWNLOADDIR)/$(notdir $*)
>  
>  # download an ftp URL (colons omitted)
> -ftp//%: 
> +$(CACHEDIR)/ftp/%:
>       wget $(WGET_OPTS) -O $(DOWNLOADDIR)/$(notdir $*).partial ftp://$*
>       mv $(DOWNLOADDIR)/$(notdir $*).partial $(DOWNLOADDIR)/$(notdir $*)
>  
>  # link to a local copy of the file
>  # (absolute path)
> -file///%: 
> +$(CACHEDIR)/file//%: 
>       @if test -f /$*; then \
>               ln -sf "/$*" $(DOWNLOADDIR)/$(notdir $*); \
>       else \
> @@ -109,7 +110,7 @@
>  
>  # link to a local copy of the file
>  # (relative path)
> -file//%: 
> +$(CACHEDIR)/file/%: 
>       @if test -f $*; then \
>               ln -sf $(CURDIR)/$* $(DOWNLOADDIR)/$(notdir $*); \
>       else \
> @@ -118,26 +119,26 @@
>  
>  # Using Jeff Waugh's rsync rule.
>  # DOES NOT PRESERVE SYMLINKS!
> -rsync//%: 
> +$(CACHEDIR)/rsync/%: 
>       rsync -azvL --progress rsync://$* $(DOWNLOADDIR)/
>  
>  # Download a directory tree as a tarball.
>  RSYNC_OPTS ?= -az
>  RSYNC_PATH ?=
> -rsynctree//%:
> +$(CACHEDIR)/rsynctree/%:
>       mkdir -p $(DOWNLOADDIR)/rsync
>       rsync -v --progress $(RSYNC_OPTS) $(RSYNC_PATH) $(DOWNLOADDIR)/rsync
>       cd $(DOWNLOADDIR)/rsync && tar -czvf ../out *
>       mv $(DOWNLOADDIR)/out $(DOWNLOADDIR)/$*
>  
>  # Using Jeff Waugh's scp rule
> -scp//%:
> +$(CACHEDIR)/scp/%:
>       scp -C $* $(DOWNLOADDIR)/
>  
>  # Check out source from CVS.
>  CVS_CO_OPTS ?= -D$(GARVERSION) -P
>  CVS_MODULE ?= $(GARNAME)
> -cvs//%:
> +$(CACHEDIR)/cvs/%:
>       mkdir -p $(DOWNLOADDIR)/cvs
>       cd $(DOWNLOADDIR)/cvs && \
>               cvs -d$(CVS_ROOT) login && \
> @@ -147,7 +148,7 @@
>  # Check out source from Subversion.
>  SVN_REVISION ?= "{$(GARVERSION)}"
>  SVN_CO_OPTS ?= -r $(SVN_REVISION)
> -svnco//%:
> +$(CACHEDIR)/svnco/%:
>       mkdir -p $(DOWNLOADDIR)/svn
>       cd $(DOWNLOADDIR)/svn && \
>               svn co $(SVN_CO_OPTS) $(SVN_PATH) $(DISTNAME) && \
> @@ -155,7 +156,7 @@
>  
>  # Check out source from Darcs.
>  DARCS_GET_OPTS ?= --partial --to-match "date $(GARVERSION)"
> -darcs//%:
> +$(CACHEDIR)/darcs/%:
>       mkdir -p $(DOWNLOADDIR)/darcs
>       cd $(DOWNLOADDIR)/darcs && \
>               darcs get $(DARCS_GET_OPTS) $(DARCS_PATH) $(DISTNAME) && \
> @@ -163,7 +164,7 @@
>  
>  # Check out source from Git.
>  GIT_REVISION ?= v$(GARVERSION)
> -git//%:
> +$(CACHEDIR)/git/%:
>       mkdir -p $(DOWNLOADDIR)/git
>       cd $(DOWNLOADDIR)/git && \
>               git clone $(GIT_PATH) $(DISTNAME) && \
> @@ -173,7 +174,7 @@
>  # Check out source from Mercurial.
>  HG_REVISION ?= $(GARVERSION)
>  HG_CLONE_OPTS ?= -r "$(HG_REVISION)"
> -hg//%:
> +$(CACHEDIR)/hg/%:
>       mkdir -p $(DOWNLOADDIR)/hg
>       cd $(DOWNLOADDIR)/hg && \
>               hg clone $(HG_CLONE_OPTS) $(HG_PATH) $(DISTNAME) && \
> @@ -182,7 +183,7 @@
>  # Check out source from Bazaar.
>  BZR_REVISION ?= before:date:$(GARVERSION)
>  BZR_CHECKOUT_OPTS ?= -r "$(BZR_REVISION)" --lightweight
> -bzr//%:
> +$(CACHEDIR)/bzr/%:
>       mkdir -p $(DOWNLOADDIR)/bzr
>       cd $(DOWNLOADDIR)/bzr && \
>               bzr checkout $(BZR_CHECKOUT_OPTS) $(BZR_PATH) $(DISTNAME) && \
> 
> === modified file 'gar.mk'
> --- gar.mk    2011-04-13 10:52:47 +0000
> +++ gar.mk    2011-10-17 19:30:06 +0000
> @@ -29,6 +29,7 @@
>  TEMPDIR = $(GARDIR_ABSOLUTE)/gar.tmp
>  FILEDIR ?= files
>  DOWNLOADDIR ?= download
> +CACHEDIR ?= cache
>  COOKIEDIR ?= cookies
>  WORKDIR ?= work
>  LOGDIR ?= logs
> 


