Re: [GNUnet-developers] Proposal: Make GNUnet Great Again?

From: Christian Grothoff
Subject: Re: [GNUnet-developers] Proposal: Make GNUnet Great Again?
Date: Sat, 9 Feb 2019 13:38:38 +0100
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Thunderbird/60.4.0

On 2/9/19 1:06 PM, Hartmut Goebel wrote:
> Assume we have a huge repo:
>   * The total number of build triggers is the same as for smaller
>     repos (assuming each push is a trigger).
>   * The build time of each run is (much) longer, since the whole repo
>     will be built from scratch. Since there are no files from a last
>     build, everything has to be built.

Is that true? autotools has been able to re-build based on changed
timestamps forever. With Buildbot, I can certainly do incremental
builds; I am not forced to do a "make distclean" every time. Similarly,
build triggers do not have to be as coarse as "any push": I could
specify that a push to directory X triggers tests (make check) in
directories X, Y and Z, or not?

If the CI requires always building every repo from scratch and always
running all tests, maybe the CI is to blame? IIRC with Buildbot, you do
have more control than "always redo everything".
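The path-based trigger policy described above can be sketched in plain Python. The directory names and the mapping are illustrative only (taken from the X/Y/Z example in the text); in an actual Buildbot setup, the equivalent logic would live in a scheduler's change filter, but the core idea is just this lookup:

```python
# Sketch of a path-based build trigger: a push only schedules the test
# suites whose directories are affected. The mapping is invented for
# illustration, mirroring the "push to X triggers X, Y, Z" example.
TRIGGER_MAP = {
    "X": {"X", "Y", "Z"},   # a change under X/ runs checks in X, Y and Z
    "Y": {"Y"},
    "Z": {"Z"},
}

def suites_to_run(changed_files):
    """Return the set of directories whose 'make check' should run."""
    suites = set()
    for path in changed_files:
        top = path.split("/", 1)[0]   # top-level directory of the change
        suites |= TRIGGER_MAP.get(top, set())
    return suites
```

With this, a push touching only `docs/README` schedules nothing, while a push touching `X/util.c` schedules checks in X, Y and Z.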

>   * Developers get the CI results later and sit around waiting for
>     them. (One of my projects takes 1:30 to finish CI, which is
>     nerve-racking.)

Agreed, but faster tests, parallel tests and selective tests based on
dependencies (which can theoretically even be decided automatically)
seem to me like the smarter solution here.
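"Decided automatically" here means computing the transitive reverse-dependency closure of the changed modules. A minimal sketch, with a made-up dependency graph (the module names are invented for illustration, not GNUnet's actual layout):

```python
from collections import defaultdict, deque

# Hypothetical dependency graph: DEPENDS_ON[m] is the set of modules
# that m builds on. If a module changes, every transitive dependent
# must be re-tested.
DEPENDS_ON = {
    "transport": {"util"},
    "core": {"util", "transport"},
    "fs": {"core"},
}

def affected(changed):
    """Modules whose tests must run: 'changed' plus all transitive dependents."""
    # Invert the edges: rdeps["util"] = {"transport", "core"}, etc.
    rdeps = defaultdict(set)
    for mod, deps in DEPENDS_ON.items():
        for d in deps:
            rdeps[d].add(mod)
    result = set(changed)
    queue = deque(changed)
    while queue:               # breadth-first closure over reverse edges
        m = queue.popleft()
        for dep in rdeps[m]:
            if dep not in result:
                result.add(dep)
                queue.append(dep)
    return result
```

A change to "fs" re-tests only "fs"; a change to "util" re-tests everything that (transitively) builds on it.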

>   * When packaging (.deb, .rpm, guix), huge repos/archives are much
>     more annoying to package: build time is long, test time is long,
>     and if anything fails or new patches are required, you start over.
>     (Some of the KDE packages take 15 minutes to build; iterating on
>     this is really painful!)

I'm not convinced that one big build is really much worse here than 50
small ones.

>     (When configuring gitlab-CI some of the issues could be solved, see
> Also from a developers perspective, a huge repo has some drawbacks: E.g.
> when switching branches or bi-secting, git will touch a lot of files
> which all need to be rebuild, which is taking time.

Granted, but your Git-driven bisection becomes much less useful if you
first have to identify which of the 50 repos is really the cause of the
regression.  So here you are trading touching files for the power to
more easily identify non-obvious sources of bugs.
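The power of bisection in a single history is that it is binary search: with n commits between a known-good and a known-bad revision, roughly log2(n) test runs suffice. A toy sketch of what "git bisect run" automates (the history and the is_bad predicate are invented; it assumes the first commit is good and the last is bad):

```python
def bisect(history, is_bad):
    """Binary search for the first bad commit, mirroring 'git bisect run'.
    'history' is ordered oldest-to-newest; assumes history[0] tests good
    and history[-1] tests bad."""
    lo, hi = 0, len(history) - 1   # lo: known good, hi: known bad
    steps = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        steps += 1                 # one 'make check' run per step
        if is_bad(history[mid]):
            hi = mid               # regression is at mid or earlier
        else:
            lo = mid               # regression is after mid
    return history[hi], steps
```

Over 100 commits this takes at most 7 test runs; split across 50 repos, you first pay the (non-logarithmic) cost of finding the right repo before bisection can even start.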

