Re: autoconf: Does it meet these cross-platform goals?


From: Ed Hartnett
Subject: Re: autoconf: Does it meet these cross-platform goals?
Date: Tue, 04 Apr 2006 05:58:39 -0600
User-agent: Gnus/5.1006 (Gnus v5.10.6) Emacs/21.4 (gnu/linux)

Matt England <address@hidden> writes:

> Summary:
>
> I'm managing a newly-open-sourced project, and I'm looking to
> accomplish these goals:
>
> 1) Ensure the source packages can build on all systems

*All* systems? Really? How about my HP calculator? 

How about my TRS-80 from 25 years ago?

> 2) Ensure the application (binary) packages run on all supported systems
> 3) Ensure libraries we deliver integrate properly with other software projects
>
> Any feedback or guidance from this community would be greatly
> appreciated.  I provide further details, as well as more-specific
> questions, below.
>
>
> Details:
>
> I manage a significant, C++ based software project that is on the
> verge of presenting an open-source format.  Up until now, my group had
> tight control on the systems/environments (mostly Windows-MinGW, RHEL,
> Fedora, and Debian to this point) for which we built and tested our
> software.  No longer; we now believe we have to support most of the
> world's systems, representing a set of very heterogeneous
> environments, and that's not trivial.

Ha!

That's for sure!!

>
> I'm looking for feedback and guidance on what tools and paradigms are
> available to my C++ based, open-source project to best ensure
> consistent cross-platform build and execution capability.  I provide
> my research findings and general understanding of the issues below.
> Thanks in advance for any help.

Yes, you want to use autotools.

>
> (I plan on posting this note in a couple different email lists,
> forums, and newsgroups.  Please forgive me if my post is not
> appropriate for your community.  For what it's worth, I have found it
> difficult reaching a conclusion for this
> cross-platform-software-distribution issue, and I think it best to
> query several different communities involved with this stuff and get
> their combined take on the matter.)

Best thing to do is start up a flame-war. I'll begin: Captain Kirk's
middle name was "Roy," not "Tiberius"!

Just kidding...

> I break this issue down into these goals:
>
> 1) Ensure the source packages can build on all systems

This is not a real goal. There is some subset of *all* systems that
you care about, and there is some subset of that which you can test
on.

However, I find that if my package builds cleanly (and it does) on
recent versions of Sun, AIX, Linux, and HP-UX, it will build just
about everywhere.

You need to identify your "must have" platforms (hardware + OS + OS
version + compilers + compiler versions = platform). Then you must
have one of each (at least) and set up automatic nightly testing on
those.

It is pointless to say "all systems" but it is reasonable to say "n
systems" where n is a known number, and you have access to all of
those n systems for testing.

Tell your boss you can't guarantee that any system works unless you
have one for testing. But if you get a good spread of testing
machines, autotools might well work out of the box on a new
machine. (Or it might not, but then take only a few minutes to fix.)
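
For what it's worth, my nightly testing is nothing fancy: a cron job
running a shell script along these lines. (Just a sketch; the paths
and the repository URL are made up.)

    #!/bin/sh
    # nightly.sh -- hypothetical nightly-build driver; adjust to taste.
    set -e
    WORKDIR=$HOME/nightly
    rm -rf "$WORKDIR" && mkdir -p "$WORKDIR"
    cd "$WORKDIR"
    svn checkout http://svn.example.org/project/trunk project
    cd project
    ./configure
    make
    make check    # a nonzero exit makes cron report the failure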

> 2) Ensure the application (binary) packages run on all supported
> systems

You mean, build automatic testing?
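
(If so, automake makes that part nearly free: list your test programs
and "make check" builds and runs the lot. A sketch, with made-up test
names:)

    # In Makefile.am -- the test programs here are hypothetical.
    check_PROGRAMS = test_parser test_netio
    test_parser_SOURCES = test_parser.cpp
    test_netio_SOURCES = test_netio.cpp
    TESTS = $(check_PROGRAMS)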

> 3) Ensure libraries we deliver integrate properly with other
> software projects

I think you mean "can be integrated properly." There's no way for you
to make sure that it actually is.

> I write more details, including associated questions, for each one of
> these goals in 3 different sections of my note below.
>
> I'm looking for the best and/or most-accepted ways to solve these
> problems.  I've done a little research (coupled with my many years of
> user/developer experience with similar systems facing the same
> problems), and the following notes reflect what I've come up with thus
> far.
>
> For what it's worth: our project has taken extreme care to create
> portable and modular C++ code and to use only the most-common and
> highly-portable libraries.  We feel that our problem lies not in
> making our code more portable, but rather making our build process and
> our binary-distribution systems more portable.
>
> ------------------------------------------------------------------
> ---- 1) Ensure the source packages can build on all systems ------
> ------------------------------------------------------------------
>
> The target stakeholder for this problem seems to be one of:
>
> a) a user looking to build binaries from source, or
> b) a developer looking to modify the source for some reason, or
> c) some combination thereof.
>
> While this stakeholder set may be more knowledgeable and experienced
> than a general "binary-only" user, I still believe an automated system
> needs to check the build environment to make sure it's suitable to
> build my software package.
>
> This task appears to be centered on 2 basic issues:
>
> * Check that a proper compiler-and-linker system exists with all the
> appropriate system headers and libraries
> * Check availability and compatibility of "external" libraries and
> their headers
>
> Am I missing anything here?

Yes, about a million things. Fortunately autoconf will remember them
for you.
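
To give you a flavor, even a minimal configure.ac buys you a pile of
those checks. (A sketch; the package name, version, and file names are
placeholders.)

    # configure.ac -- minimal sketch; names are hypothetical.
    AC_INIT([myproject], [1.0])
    AC_CONFIG_SRCDIR([src/main.cpp])
    AM_INIT_AUTOMAKE([foreign])
    AC_PROG_CXX                          # find a working C++ compiler
    AC_CHECK_HEADERS([unistd.h fcntl.h]) # example system-header checks
    AC_CONFIG_FILES([Makefile])
    AC_OUTPUT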

>
> For what it's worth: the external libraries (besides "standard" system
> libraries) that we currently use include but may not be limited to the
> following: various Boost-C++ libraries, OpenSSL, BZip2, libpqxx,
> Xerces-c, and optionally ACE
> (<http://www.cs.wustl.edu/~schmidt/ACE.html>), libcurl, and xmlrpc-c.
>
> Up until now, we were including each library (and its associated
> header files) from the above toolset in a Subversion-controlled
> "external" directory for each platform (eg, one for MinGW, Fedora,
> RHEL, Debian, etc).  However, we're finding that some boost libs don't
> work for all debian3.1 systems, etc.  I suspect we are going to run
> into this problem more and more over time, and as such we need to let
> the system in question provide the library that's compatible with said
> system (in the aforementioned case: let Debian's apt download/build
> the right boost library).  Is my understanding correct?

Yes, this seems to be the only way forward for this kind of problem,
which I also face.
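
The nice thing is that configure can then probe whatever the system
hands you. Checks for a few of the libraries you list might look
something like this (the entry points are the libraries' usual
published functions; treat the exact set as a sketch):

    # In configure.ac: probe system-provided libraries and headers.
    AC_CHECK_LIB([bz2], [BZ2_bzCompressInit], [],
                 [AC_MSG_ERROR([libbz2 is required])])
    AC_CHECK_LIB([curl], [curl_easy_init])
    AC_CHECK_HEADER([openssl/ssl.h])
    AC_LANG_PUSH([C++])                  # Boost headers need a C++ compile
    AC_CHECK_HEADER([boost/shared_ptr.hpp], [],
                    [AC_MSG_ERROR([Boost headers not found])])
    AC_LANG_POP([C++])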

>
> My project also uses an extensive and modular GNU-make Makefile system
> based on a core Makefile we authored that builds rules dynamically (by
> heavily leveraging the $(eval) function in make) based upon
> per-application "input" Makefiles.  Further, we do not hard-code lists
> of source files in our Makefiles, for we auto-find the source files in
> each application or library subdirectory on the fly; therefore, when
> one adds or removes source-code files to our repos (we use Subversion,
> although it may not matter that much), we require no changes to the
> Makefiles.  This system has served us well, and we're not inclined to
> move away from this system unless absolutely necessary.  (And if we
> want to support single-source control of all build processes, even
> with non-MinGW Windows systems, we might have to move away from this
> to something like bakefile or CMake...but more on this in a minute.)

No, switch to automake.

It's a lot easier to deal with, and you can say goodbye to makefile
programming forever. And I, for one, don't miss it!

It's a thousand times easier to write and maintain an automake file.
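
To give you an idea of the scale: a complete automake input for a
library plus a program is a handful of lines. (A sketch with made-up
names; the library rule assumes libtool is enabled in configure.ac.)

    # Makefile.am -- sketch; file and library names are hypothetical.
    lib_LTLIBRARIES = libmyproj.la
    libmyproj_la_SOURCES = core.cpp net.cpp util.cpp
    bin_PROGRAMS = myproj
    myproj_SOURCES = main.cpp
    myproj_LDADD = libmyproj.la

Automake generates the dependency tracking, the install rules, "make
dist", and "make check" from that.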

>
> However, I doubt the Makefile system will be robust enough to handle
> the nuances of truly cross-platform builds; maybe that's an

You bet! Nor will every user have GNU make.

> understatement.  The tried-and-true tool to address this seems to be
> autoconf, and I'm currently gearing myself up to author some
> autoconf-based control files.  However, autoconf does not appear to
> address non-MinGW Windows environments.  For that reason, my project

It does, actually. It works with Visual Studio, I believe, but I have
not tried to get that working. (I distribute a set of Visual Studio
solution files for Windows, but that's a real pain.)

> is currently supporting only MinGW in Windows environments.  However,
> I'd like to be able to "single-source control" the build process for
> non-autoconf-supported systems like VisualStudio systems (and to a
> lesser extent CodeBlocks, Dev-C++, etc...although they are a safer bet
> to read GNUmake Makefiles) in the future.  bakefile is the only thing
> that I've seen so far that supports this approach.  CMake might,
> but I'm not sure about VStudio; further, CMake requires that all my
> developer-users change their usage patterns (from './configure && make
> && make install') and build and use CMake...and I'm not yet
> inclined to change this paradigm.

Use automake. This is exactly what it is good for.

>
> A note about autoconf:  I'm hoping it provides a *supplement* to my
> existing Makefile system, instead of replacing such system with new,
> auto-generated makefiles, etc.  (For this and possibly other reasons
> I'm steering clear of using automake, as per experiences like these:
> <http://subversion.tigris.org/hacking.html#configury>).  I'm a
> control

Do you want to solve the problem or what? Learn to let go.

> freak about my build-control process, and I don't want some automated
> tool specifying what my build rules and dependencies are.  Rather, I

Then you are in a world of hurt trying to build on *all* systems.

> want autoconf (or some tool that replaces it) to simply make sure that
> the build environment (on said machine) is sufficient and then set the
> make variables accordingly as inputs to my existing make/Makefile
> process.  Is this the way it works...or at least can work...with
> autoconf?  Another way to ask this: can autoconf essentially be made a
> "slave" to the Makefile.in file?
>

It can work that way, but you're missing the point, and most of the
benefits.
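
Mechanically, it does work that way: whatever you AC_SUBST in
configure.ac gets spliced into your own Makefile.in wherever an
@VARIABLE@ appears. (A sketch; EXTRA_LIBS is a made-up variable.)

    # In configure.ac:
    AC_PROG_CXX
    AC_SUBST([EXTRA_LIBS])    # hypothetical; set it however you like
    AC_CONFIG_FILES([Makefile])

    # In your hand-written Makefile.in:
    CXX = @CXX@
    EXTRA_LIBS = @EXTRA_LIBS@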

Sorry, no time for the rest of your long message.

Best thing for you to do is completely replace your makefiles. Believe
me, you won't miss them.

I did this with a major C/C++/F77/F90 library and utility package, and
it took a long time and a lot of work. But it was well worth
it. Maintaining the old makefiles was a significant problem.

Now I am much happier, my build system is far more robust, and adding
a new system takes little or no work. The autotools way is hard
at first but scales very well. The hand-written makefile is even
harder to write, and scales poorly.

Ed
-- 
Ed Hartnett  -- address@hidden




