From: Edward Welbourne
Subject: Re: Shorter and less error-prone rule for automatic prerequisite generation in the GNU Make manual
Date: Thu, 29 Apr 2010 12:59:13 +0200

> Delete a "clean-depend" rule on sight,

I cannot agree.
If I write a rule to make something, I also write a rule to get rid of
it.  It's just basic hygiene ...

> or rename it to the more accurate "break-future-builds".

If you have a sensible rule to generate .d files when needed, you
haven't broken your builds - you've just obliged yourself to
regenerate .d files.  Which may be wasteful, but see below.
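For concreteness, a minimal sketch of such a rule, in the style the manual describes (SRCS and the flags here are illustrative, not taken from this thread):

```makefile
SRCS := main.c util.c

# Generate each .d on demand; -MT names both the .o and the .d as
# targets of the recorded rule, so the .d is remade whenever the .o
# would be.
%.d: %.c
	$(CC) $(CPPFLAGS) -MM -MT '$*.o $@' -MF $@ $<

# Including the .d files is what obliges make to (re)build them,
# re-executing itself afterwards if any were remade.
include $(SRCS:.c=.d)
```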

> I can think of absolutely no reason
> to delete dependency files without deleting the objects they describe.

Do an update from your version control system and watch the make files
move to a new location in the directory tree.  If your make files
refer to source files via relative paths, your .d files are now
broken, but the .o files are fine.  (This is rare: I cite it because
it happens to be the most recent reason I've actually needed it in
practice.)  Crucially, if you *don't* have a clean-depend rule, you
can't move your make files without breaking version-control tools that
do bisect [*], or without obliging them to run a full make clean
between each test build.  With a clean-depend rule, however, only the
dependencies need to be regenerated - which costs no more than the
preprocessing step of the compiler - except for the files that
actually changed.

[*] for those unfamiliar with bisect: modern version-control systems
such as git include a command that, given two revisions between which
a bug was introduced and a script that builds and tests for the bug,
will automatically find the revision at which the bug was introduced.
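To make the footnote concrete, here is a hypothetical, self-contained bisect session: it builds a throwaway repository in a temporary directory, in which the "bug" is a marker written into a file from the third commit onward, and the test is a one-line grep whose exit status tells bisect good (0) or bad (non-zero).

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

# Five commits; commits 3..5 carry the "bug" (state reads "bad").
for i in 1 2 3 4 5; do
    if [ "$i" -ge 3 ]; then echo bad >state; else echo good >state; fi
    echo "rev $i" >file
    git add file state
    git commit -q -m "rev $i"
done

root=$(git rev-list --max-parents=0 HEAD)
git bisect start HEAD "$root" >/dev/null   # HEAD is bad, root is good
git bisect run grep -q good state >/dev/null

git log -1 --format=%s refs/bisect/bad     # prints "rev 3"
```

In a real build, the command given to git bisect run would be the script that builds and tests; a clean-depend rule is what lets that script avoid a full make clean at every step.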

Back before disk got cheap, I needed a clean-depend rule for when the
disk filled up during the last build and a bunch of .d files got
corrupted because they were only half-written.  And, before I got all
the subtleties ironed out of our dependency tracking, it was also
useful for debugging the dependency generation itself.

Speaking of the subtleties of dependency tracking: do an update in
your version control system, watch some header go away - and all the
files that used to reference it drop those references.  Your .d files
claim a bunch of stuff depends on this missing file, but you have no
rule to regenerate it.  So make will not even try to compile anything
(even though everything *would* compile fine), because your .d files
say that all the .o files needing recompilation depend on a file that
doesn't exist any more; make clean-depend fixes that.  (I have a
better fix for that, but I'll leave it for my manual update.)  Without
clean-depend, you have to make clean to make any progress, or hand-fix
every .d file that references the lost .h file.
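One widely used mitigation for the vanished-header problem - not necessarily the fix the author has in mind for the manual - is GCC's -MP option, which appends an empty, dependency-less target for every header mentioned, so make treats a deleted header as trivially up to date instead of stalling:

```makefile
# -MMD writes the .d as a side-effect of compiling; -MP adds a
# commandless target per header, so a header that disappears no
# longer blocks the rebuild of the .o files that used to include it.
%.o: %.c
	$(CC) $(CFLAGS) -MMD -MP -c $< -o $@
```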

>>> By providing a rule for the .d files you'll cause make to
>>> re-execute itself, so you'll just end up parsing the Makefile twice.
>> Only if you include .d files that aren't really needed (which is
>> common practice).

> There's still the case of a .o and .d pair that both exist where the
> .o is out of date to one of its dependencies.  If the .d file is a
> target, then make will first rebuild the .d, re-exec, then rebuild the
> .o.  The re-exec is unnecessary there; building the .d as a
> side-effect of building the .o is sufficient.

If you generate the .d as a side-effect of compilation, don't listen
to the manual's advice that says you need to sed its content to claim
that the .d depends on the same things the .o does.  If those things
have changed, the .o shall be regenerated and hence so shall the .d;
and you don't need this updated version of the .d file to discover
that the .o needs rebuilding.  Changes to .h files consequently never
trigger a re-exec.
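A sketch of that side-effect scheme (OBJS is illustrative): the .d is written while the .o compiles, is included only if it already exists, and is never itself a target - so header changes never force make to re-execute:

```makefile
OBJS := main.o util.o

# The .d file falls out of compilation for free.
%.o: %.c
	$(CC) $(CFLAGS) -MMD -c $< -o $@

# '-include' silently skips missing .d files rather than trying to
# build them, so no rule for .d files is needed at all.
-include $(OBJS:.o=.d)
```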

That just leaves the case where the .d and .o are both out of date
relative to their primary source file, when both shall still, indeed,
be rebuilt.  I can't think of a clean way to get rid of this and, in
practice, have never seen it as a big problem.  A few source files
have changed, they get preprocessed twice and make re-execs in
between.  It's a waste, but you can avoid it by deleting the .o file
matching any source file you edit, so that its .d gets ignored - and I
don't bother even with that.  It's more of a cost when updating from
version-control and lots of stuff has changed; my response to that is
to kick off a build in background as soon as I update, so that it'll
have sorted itself out by the time I actually care (i.e. I've made my
edits and am ready to build what I want), at which point we're back to
few affected compilation units.

