
Re: [lmi] VCS caching


From: Vadim Zeitlin
Subject: Re: [lmi] VCS caching
Date: Wed, 18 Apr 2018 01:08:26 +0200

On Tue, 17 Apr 2018 22:25:47 +0000 Greg Chicares <address@hidden> wrote:

GC> On 2018-04-16 23:58, Vadim Zeitlin wrote:
GC> [...]
GC> >  Sorry, but I'm afraid there are still a couple of problems left and I've
GC> > created https://github.com/vadz/lmi/pull/82 to address them.
GC> > 
GC> >  The first two commits should be really uncontroversial as they just fix
GC> > small but real mistakes when using git-clone and git-fetch respectively.
GC> 
GC> Cherry-picked into lmi master, and pushed.

 Thanks!

[...]
GC> The reason why I don't want to apply this particular change is
GC> that it seems to restore only some of the careful, validated work
GC> we had done.

 I really don't want to waste even more of your time, but I'd just like to
note that needing only this part, instead of the more complicated logic, is
one of the benefits of using a single repository, as we do now, compared to
the bare repository plus separate checkout we used before. In other words,
the fact that it restores only part of the old logic is a feature, not a bug.
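
 To make that concrete, here is a rough sketch of the difference (the paths,
variables and ref name are only illustrative placeholders, not the actual
install_wx.sh code):

    # Old scheme: a bare cache repository plus a separate checkout.
    # ($remote_url and $build_dir are hypothetical placeholders.)
    git clone --bare "$remote_url" /cache_for_lmi/vcs/wxWidgets.git
    git clone /cache_for_lmi/vcs/wxWidgets.git "$build_dir/wxWidgets"

    # New scheme: one full clone serves as both cache and working tree,
    # so refreshing it is just a fetch followed by a checkout.
    # ($wx_commit_sha is likewise a placeholder.)
    cd /cache_for_lmi/vcs/wxWidgets
    git fetch origin
    git checkout "$wx_commit_sha"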

GC> Let me offer a suggestion that I think will reduce your use case
GC> to the one that the simplified 'install_wx.sh' fully supports:
GC> write a separate script to clone the repository you keep on your
GC> LAN into your /cache_for_lmi/vcs directory. Do that once, and
GC> then that directory is a complete proxy; it shouldn't need to
GC> connect to the central server on github unless submodules change.

 Yes, sure, I've already created the local mirror in my VM, and when I need
to do it again in another VM, I'll just reapply this patch, which I'll keep
locally. It's not a big problem for me; I just thought the changes were
harmless (and simple) enough to be applied anyhow, but I understand
perfectly well that you don't want to spend even more time on this, so let's
officially close this thread.
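
 (For the archive, a minimal sketch of the one-time script Greg suggests,
with a purely hypothetical LAN URL, could look like this:

    #!/bin/sh
    # Seed the lmi VCS cache once from a repository kept on the LAN;
    # the URL below is a made-up placeholder.
    lan_url='git://lan-host.example/wxWidgets.git'
    mkdir -p /cache_for_lmi/vcs
    git clone --recurse-submodules "$lan_url" /cache_for_lmi/vcs/wxWidgets

After that, the cache directory acts as the complete local proxy that Greg
describes.)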

 Thanks,
VZ

