
RE: List of directories under a "top-level" module

From: Arthur Barrett
Subject: RE: List of directories under a "top-level" module
Date: Thu, 9 Oct 2008 19:21:18 +1000


> Subject: Re: List of directories under a "top-level" module

This subject does not appear in any way related to your e-mail.  Please
use a relevant subject line.

> damn update takes over 40 minutes. And since all machines start this
> update process at around the same time, there's some bandwidth crunch
> being recorded.

Update is very efficient - why is there any bandwidth crunch?  What is
the actual update command you are using?  Which exact CVS server and
client versions are you using?  What protocol are you using between the
client and the server (what is the CVSROOT)?  What 'type' of files are
being updated (are they binary JAR files)?

Regardless of the file type and the number of files, the update command
only generates network traffic for files that have changed, and that
traffic is just a 'patch'.  It could be that you are using some update
option that causes the whole sandbox to be rebuilt.  Alternatively, your
build process may be 'touching' the files: if a sandbox file's date/time
no longer matches the timestamp recorded in CVS/Entries, the client
sends the whole file to the server for comparison.  Or perhaps there is
significant skew between your local and server clocks.

Your idea of tar/gz'ing the files that the first machine updates and
then sending those to every other machine implies that sending the
ENTIRE FILE will be quicker than sending the patches to the files (i.e.
what CVS update does) - which just should NOT be the case.  If for some
reason the 'update' itself cannot be fixed, you could use the 'find'
command to collect all the files changed within the last day (this would
include the CVS control directories), tar/gz them, and then FTP/SCP that
archive to each other machine, and they should be identical. Eg:
    $ find . -ctime -1 -type f -print0 | tar czf ~/today-$(date +%F).tar.gz --null -T -
(-ctime -1 matches files changed within the last 24 hours, 'date +%F'
avoids spaces in the archive name, and '--null -T -' makes GNU tar read
the NUL-separated file list from find in a single invocation.)

If the majority of files being updated are binary then rsync may be a
better way to synchronise them, since its delta-transfer mechanism copes
better with binary changes than CVS's diff-based patches.


