Robert Jørgensgaard Engdahl
Re: Shorter and less error-prone rule for automatic prerequisite generation in the GNU Make manual
Thu, 29 Apr 2010 16:39:12 +0200
Edward Welbourne <address@hidden> wrote on 29-04-2010
> > Delete a "clean-depend" rule on sight,
> I cannot agree.
> If I write a rule to make something, I also write a rule to get rid
> of it. It's just basic hygiene ...
> > or rename it to the more accurate "break-future-builds".
> If you have a sensible rule to generate .d files when needed, you
> haven't broken your builds - you've just obliged yourself to
> regenerate .d files. Which may be wasteful, but see below.
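(For concreteness, the kind of clean-depend rule under discussion might look like the sketch below; the variable names DEPS and OBJS are illustrative, not from the thread.)

```make
# Sketch only: assumes the objects are listed in OBJS, with a .d beside each .o.
DEPS := $(OBJS:.o=.d)

# Pull in whichever dependency files exist; the leading '-' makes
# missing ones harmless instead of fatal.
-include $(DEPS)

.PHONY: clean-depend
clean-depend:
	$(RM) $(DEPS)
```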
> > I can think of absolutely no reason
> > to delete dependency files without deleting the objects they describe.
> Do an update from your version control system and watch the make files
> move to a new location in the directory tree. If your make files
> refer to source files via relative paths, your .d files are now
> broken, but the .o files are fine. (This is rare: I cite it because
> it happens to be the most recent reason I've actually needed it in
> practice.) Crucially, if you *don't* have a clean-depend rule, you
> can't move your make files without breaking version-control tools that
> do bisect [*] or obliging them to do a full make clean between each
> build. With a clean-depend rule, however, only the dependencies get
> regenerated - and this cuts out all but the preprocessing step of the
> compiler - except for the files that actually changed.
> [*] for those unfamiliar with bisect: modern version-control systems
> such as git include a command that, given two revisions between which
> a bug was introduced and a script that builds and tests for the bug,
> will automatically find the revision at which the bug was introduced.
> Back before disk got cheap, I needed a clean-depend rule for when
> disk got full during the last build and a bunch of .d files got
> corrupted because they only got half-written. And, before I got
> the subtleties ironed out from our dependency tracking, it was also
> useful when debugging your dependency generation.
> Speaking of the subtleties of dependency tracking: do an update in
> your version control system, watch some header go away - and all files
> that used to reference it drop those references. Your .d files say
> a bunch of stuff depends on this missing file; but you have no rule to
> regenerate it. So make will not even try to compile anything (even
> though everything *would* compile fine) because your .d files say that
> all the .o files that need recompiling depend on a file that doesn't
> exist any more; make clean-depend fixes that. (I have a better fix
> for that, but I'll leave that for my manual update.) Without one you
> have to make clean to make any progress, or fix every .d file that
> references the lost .h file.
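One well-known fix for the vanished-header problem (possibly not the one Edward has in mind for his manual update) is GCC's -MP option, which writes an empty phony rule for every header the compilation saw, so a .d file that mentions a since-deleted header no longer stops make:

```make
# Sketch, assuming a GCC/Clang-style compiler: -MMD emits foo.d as a
# side-effect of compiling foo.o, and -MP adds an empty rule for each
# header, so a deleted header is not a fatal "no rule to make" error.
%.o: %.c
	$(CC) $(CFLAGS) -MMD -MP -c $< -o $@
```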
If an update to new source code, which would compile
just fine in a clean checkout, breaks the incremental build, then the build
system is erroneous. Anything that seeks to fix such bugs by user intervention
is not a real solution. At least that is my opinion. I just don't
know how that could be solved nicely in Make.
> >>> By providing a rule for the .d files you'll cause make to
> >>> re-execute itself, so you'll just end up parsing the makefiles twice.
> >> Only if you include .d files that aren't really needed (which is
> >> common practice).
> > There's still the case of a .o and .d pair that both exist where the
> > .o is out of date relative to one of its dependencies. If the .d
> > file is a target, then make will first rebuild the .d, re-exec, then
> > rebuild the .o. The re-exec is unnecessary there; building the .d as
> > a side-effect of building the .o is sufficient.
> If generating .d as a side-effect, don't listen to the manual's advice
> that says you need to sed its content to claim that the .d depends on
> the same things the .o does. If those things have changed, the .o
> shall be regenerated and hence so shall the .d; and you don't need
> this updated version of the .d file to discover that the .o needs
> rebuilding. Changes to .h files consequently never trigger re-exec.
> That just leaves the case of where the .d and .o are both out of date
> relative to their primary source file, when both shall still, indeed,
> be rebuilt. I can't think of a clean way to get rid of this but, in
> practice, have never seen it as a big problem. A few source files
> have changed, they get preprocessed twice and make re-execs in
> between. It's a waste, but you can avoid it by deleting the .d
> matching any source file you edit, so that it gets ignored - and I
> don't bother even with that. It's more of a cost when updating from
> version control and lots of stuff has changed; my response to that is
> to kick off a build in the background as soon as I update, so that it'll
> have sorted itself out by the time I actually care (i.e. I've made my
> edits and am ready to build what I want), at which point we're back to a
> few affected compilation units.
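The side-effect scheme Edward describes could be sketched as below, assuming a GCC-style compiler: no sed post-processing of the .d, and no rule that makes the .d itself a target, so make never re-execs merely to bring a dependency file up to date.

```make
# The .d is a by-product of compiling the .o; there is deliberately
# no separate rule with the .d as target.
%.o: %.c
	$(CC) $(CFLAGS) -MMD -c $< -o $@

# Include whatever dependency files already exist; a stale .d for an
# edited source merely triggers the rebuild that was needed anyway.
-include $(wildcard *.d)
```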