
Fwd: cross compiling newbie questions

From: Steffen Dettmer
Subject: Fwd: cross compiling newbie questions
Date: Mon, 15 Sep 2008 11:22:18 +0200


Thank you for your response and help.

Eric, sorry that you got this mail twice; I accidentally
replied to you via private mail (my usual MUA, mutt, handles
mailing lists automatically, so I forgot).

I am also sorry for the horribly long attachment in my first
mail; it was appended automatically. I have finally decided to
switch to a web mail account instead, which I hope is better.

On Sat, Sep 6, 2008 at 9:43 PM, Eric Blake <address@hidden> wrote:
> According to Steffen DETTMER on 9/2/2008 3:16 AM:
> > If I understood correctly, passing --host in general enables
> > cross compiling.
> Yes (and you should also specify --build).

I've reread the autoconf manual's `Hosts and Cross-compilation'
section, but it seems I still do not really understand. We run
with --build=`config.guess` (set by some wrapper script).
Not that it is important, but why isn't this the default?
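For reference, an invocation along those lines might look as
follows (a sketch; the host triplet and tool names are
illustrative, not necessarily the exact ones from this setup):

```shell
# Pin --build explicitly and pass a differing --host; it is the
# build/host mismatch that switches configure into
# cross-compilation mode.
./configure \
  --build="$(./config.guess)" \
  --host=arm-unknown-elf \
  CC=arm-unknown-elf-gcc
```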

> > My question in short:
> >
> >   How to correctly set mandatory default options like -isystem
> >   and -nostdlib -lmylibc?
> The same as for non-cross-compilation:
> ./configure CC='my-cross-cc -nostdlib' CFLAGS='-isystem path/to/dir' \
>  LIBS='-lmylibc'
> In other words, ./configure should NOT encode these defaults (because they
> are not necessarily mandatory defaults for all users), rather, the user
> should be responsible for specifying them.

So that means the idea/concept is that a user shall be forced
to specify all defaults? Isn't this uncomfortable? If, say, he
specifies a non-gcc compiler, in practice he would also need to
pass CFLAGS (and some 20 variables for the linker, options and
such) with values that are logically redundant (i.e. there is
no alternative to the /default/). Also, if I understood
correctly, when CFLAGS is passed it should be taken as
specified and not modified. So the user needs to know what to
add to CFLAGS (and all the others) when specifying
--enable-debug, correct? In the end, the user needs to
configure everything manually and know how to do that.

I think this conflicts with the idea of autoconf, so I must
have missed or misunderstood something. Do you have a pointer
to some background information?

You wrote that those defaults are not necessarily mandatory.
But any default can be overridden by setting the appropriate
variable anyway, correct? A default that cannot work (like
CC=myproprietarycompiler together with -ggdb or so) just leads
to an early configure failure.
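One pattern that might reconcile --enable-debug with
user-supplied CFLAGS is to only add debug flags when the user
did not set CFLAGS themselves. A configure.ac sketch (the
cache variable check relies on autoconf recording precious
variables as ac_cv_env_*; treat it as an assumption to verify
against your autoconf version):

```shell
# configure.ac fragment (sketch): --enable-debug sets CFLAGS
# only if the user left CFLAGS untouched, so explicit user
# flags are never modified.
AC_ARG_ENABLE([debug],
  [AS_HELP_STRING([--enable-debug], [build with debug flags])])
AS_IF([test "x$enable_debug" = xyes],
  [AS_IF([test "x$ac_cv_env_CFLAGS_set" = x],
     [CFLAGS="-g -O0"])])
```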

> > We `defined' a name for this, lets say `arm-ingenico-elf_device'
> > (the value of `device' depends) and use this in case
> > statements to select options. Is this wrong and how would it be
> > correct?
> In general, writing platform-name-specific tests in
> configure.ac (the name configure.in is obsolete)

Yes, thanks also for this point. Is there any difference
(except that the logical meaning changed from `input file' to
`autoconf input file')?
Whenever a configure.in is renamed to configure.ac, the
sandboxes break, i.e. `make' fails because of a stale (make)
dependency, and I think neither autoreconf nor autoreconf
--force can recover from that. So I hope support for
configure.in will last forever :-)

> is not the autoconf way.  Rather, you should
> test for features that a platform either provides or lacks, independently
> of the platform's name.  Unfortunately, for cross-compilation, this
> doesn't always work (since you can't run test binaries, and not all
> features can be tested solely at link time), but even then, you are better
> off using only names provided in config.guess (patches to add host
> triplets to config.guess should be sent to the list mentioned in that file).

I think that for gcc and/or GNU/Linux, platform-independent
feature testing usually works well, but for other platforms it
is often hard. Also, sometimes headers declare functions that
cannot be used, and sometimes you can even link against them
(but not run them). I understand that this is a problem of the
compiler suite / tool chain in use, not of autoconf.
Often the result then even needs to be post-processed, like
running objcopy or putting special headers around some boot
image, but I haven't found out how to do this correctly (we
use some_SCRIPTS or so to poll some automake dependency, and
automake thinks some temporary ELF binary is the real
executable). Is there a howto about such questions, or are
they considered out of scope?
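One way to express such post-processing might be an explicit
rule in Makefile.am, so the image is derived from the ELF with
a normal make dependency instead of tricking automake. A
sketch (all names are illustrative; OBJCOPY would need to be
located in configure, e.g. via AC_CHECK_TOOL):

```shell
# Makefile.am fragment (sketch): build the ELF as an ordinary
# program, then derive the flashable image from it.
bin_PROGRAMS = firmware.elf
firmware_elf_SOURCES = main.c

all-local: firmware.img
firmware.img: firmware.elf$(EXEEXT)
	$(OBJCOPY) -O binary $< $@

CLEANFILES = firmware.img
```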

> > How to check compilers/environments with mandatory parameters?
> I still think that the easiest way is writing your cross-compiler in such
> a way that you don't need those parameters to be mandatory.  In other
> words, write a wrapper script that does all the work:
> #!/bin/sh
> exec real-cross-cc -isystem path/to/dir "$@"
> and install it on PATH ahead of the cross-compiler.
> > Another way could be to have arm-ingenico-elf_device-gcc etc
> > being a wrapper passing -isystem etc, but I think this is
> > ugly.
> How is that ugly?  It actually sounds the cleanest to me,
> because you can then use the wrapper as your cross-compiler
> without having to worry about all the details of how to make it
> work.

This would be easy if the script were constant, but we would
need to generate it, because the system headers directory is a
subdirectory of the sandbox and is not installed into the
system; we have several versions, and the exact version
depends on the project / sources. The same goes for libraries:
we link statically but need the right version. So when using
PATH we would need to adjust it in the Makefile to cover some
subdirectory, or use full paths to CC (maybe something
like ...).

On Linux, this would mean working out how to set up a project
that includes its own system-independent glibc (or whatever
libc) instead of using the one the system's gcc would use.
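Spelled out as a compiler invocation, that setup might look
roughly like this (a sketch; the mylibc paths and library name
are placeholders, not from this thread):

```shell
# Compile and link against an in-tree libc instead of the
# system one: suppress the default startup files/libs and
# point the compiler at the sandbox's headers and archive.
gcc -nostdlib \
    -isystem "$top_srcdir/mylibc/include" \
    -o prog main.c \
    "$top_builddir/mylibc/libmylibc.a"
```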

A big problem is that the user shall be able to check out into
an arbitrary directory on Linux, Cygwin or MinGW (which does
not work yet, but in theory) and it should still work. The
wrapper scripts depend on --build and on the top-level source
and build directories, and those scripts are needed for CC=;
but CC should be set before configure is run, while the
scripts are created by configure. I'm not sure what problems
arise from that
 (we have several configures in subdirectories and a top-level
  configure with AC_CONFIG_SUBDIRS, so the first configure
  would get a CC that does not exist yet?).

So how does one configure a project that optionally uses part
of a toolchain (e.g. a libc-wannabe) living in a subdirectory?
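One way around the chicken-and-egg problem might be to let the
top-level configure generate the wrapper (via AC_CONFIG_FILES)
before the sub-configures in AC_CONFIG_SUBDIRS run, and pass
the wrapper's eventual path as CC. A sketch; all names and
paths are illustrative, and the autoconf part is simulated
with sed so the idea can be tried without a cross toolchain
("real-cross-cc" is a placeholder, so the wrapper only echoes
the command line it would execute):

```shell
# In configure.ac one might write (sketch):
#   AC_CONFIG_FILES([cross-cc], [chmod +x cross-cc])
# with a template cross-cc.in like:
#   #!/bin/sh
#   exec real-cross-cc -nostdlib \
#     -isystem @abs_top_srcdir@/libc/include "$@"

# Simulated here with plain sed substitution:
top_srcdir=/tmp/sandbox
cat > /tmp/cross-cc.in <<'EOF'
#!/bin/sh
echo real-cross-cc -nostdlib -isystem @abs_top_srcdir@/libc/include "$@"
EOF
sed "s|@abs_top_srcdir@|$top_srcdir|" /tmp/cross-cc.in > /tmp/cross-cc
chmod +x /tmp/cross-cc
/tmp/cross-cc -c foo.c
# prints: real-cross-cc -nostdlib -isystem /tmp/sandbox/libc/include -c foo.c
```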

> > This message may contain confidential and/or privileged information.
> It is considered poor netiquette to post email to a publicly archived
> mailing list which contains an unenforceable disclaimer.  Some people
> refuse to reply to such mail on principle.

Yes, I understand and agree, and I am sorry for that. I hope
it is better now.


