
Re: Automake violations of the GNU coding conventions


From: K. Richard Pixley
Subject: Re: Automake violations of the GNU coding conventions
Date: Tue, 19 Jun 2007 13:01:18 -0700
User-agent: Thunderbird 1.5.0.12 (X11/20070604)

Harlan Stenn wrote:
> And this situation is even more layered - I am using GNU AutoGen for one
> big project, and I do not want to require my other developers to install
> it.  Therefore I check in my autogen-generated files and we use a
> 'bootstrap' script after doing a code checkout to make sure the
> timestamps are OK.  I have a similar situation where this same package
> must be developed on stock Windows boxes, so I also check in the files
> generated by yacc/bison.

Yes, this is a very similar use case to the ones I'm using.

> I'd love to have a better solution to these problems, but at least I
> have a workable solution.

I'm aware of several.
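
As an aside before getting to those: the "bootstrap" script Harlan describes typically just touches the checked-in generated files in dependency order, so that make sees them as newer than their sources and never invokes the generating tools. A hypothetical sketch, with an invented file list (the real list is package-specific):

    #!/bin/sh
    # Hypothetical bootstrap: after checkout, touch checked-in generated
    # files in dependency order so make won't try to rebuild them with
    # tools (autogen, bison, ...) a developer may not have installed.
    touch aclocal.m4
    touch configure config.h.in
    find . -name Makefile.in -exec touch '{}' +
    touch src/parser.c src/parser.h   # committed yacc/bison output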

The one we used back at Cygnus was simply to assume that we were all developers and to require all tools, including non-standard ones. Since we were creating a fully integrated tool chain release, we could simply supply any non-standard tools we needed (byacc, m4, etc.). What we stored in CVS was just the source, no generated files. Then, to create FSF-style releases, we'd build the package in $(srcdir) and run "make distclean" to remove all of the .o files while leaving the generated files. There was also a Makefile target for building the tarball and encoding the version number. This approach works reasonably well if you are the authoritative maintainer of the code you're releasing. It doesn't work very well for staged releases.
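
A hypothetical sketch of such a target, with invented package and version names, assuming the tree has already been configured and built in $(srcdir) and that the source directory is named $(PACKAGE)-$(VERSION):

    # Roll an FSF-style tarball from a built tree.  "make distclean"
    # removes the .o files and other build products but leaves the
    # distributed generated files (bison output, configure, ...) alone.
    PACKAGE = mypkg
    VERSION = 1.0

    release:
	    $(MAKE) distclean
	    cd .. && tar czf $(PACKAGE)-$(VERSION).tar.gz $(PACKAGE)-$(VERSION)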

For staged releases, where I'm maintaining something that also has an "upstream" maintainer, my local copies may or may not include local changes, and the task is to create an integrated release (think Cygnus, Debian, Red Hat, or Ubuntu), a more direct approach is to run configure once, save all of the resulting files, right down to the Makefiles, and then break all of the dependency rules involving Makefiles, configure, automake, etc. For commercial packages or highly integrated systems (like Debian), configure becomes a liability and doesn't really add any functionality. The downside of this approach is that your local copy has all of these gratuitous Makefile and configure differences from the upstream version, which makes merging follow-up upstream versions more tedious than necessary.
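
Concretely, one way to freeze the tree after the single configure run, without hand-editing the generated Makefiles, is to override the regeneration tools with no-ops on the make command line. A hedged sketch, assuming a stock automake-generated tree (automake's AM_MAINTAINER_MODE macro is the official knob for disabling these rules, for what it's worth):

    # Configure once, then freeze the result: the rebuild rules still
    # fire, but Makefile.in, configure, and aclocal.m4 are never
    # regenerated because each tool has been replaced by ":" (a no-op).
    ./configure --prefix=/usr
    make AUTOCONF=: AUTOHEADER=: AUTOMAKE=: ACLOCAL=: all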

Another approach is simply to fix the local copy of automake. In the absence of the circular dependencies (Makefile.in depends on Makefile depends on Makefile.in, or configure depends on Makefile depends on configure), these problems go away. The downside is that you have to maintain your own copy of automake.
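
For concreteness, the rules in question look roughly like this in an automake-generated Makefile (simplified from memory). The cycle: the rule that rebuilds Makefile.in lives in Makefile, and Makefile is itself regenerated from Makefile.in. A fixed automake would simply stop emitting the second and third rules:

    Makefile: Makefile.in config.status
	    ./config.status $@

    Makefile.in: Makefile.am aclocal.m4
	    cd $(srcdir) && $(AUTOMAKE) --gnu Makefile

    configure: configure.ac aclocal.m4
	    cd $(srcdir) && $(AUTOCONF)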

You'll also find that random other packages insert silliness like these circular dependencies into their Makefiles for other files that are generated by non-standard tools. In my opinion, the right place to fight this bitrot is with the upstream maintainers.

--rich



