Re: RFC: enum instead of #define for tokens

From: Paul Eggert
Subject: Re: RFC: enum instead of #define for tokens
Date: Fri, 5 Apr 2002 02:37:02 -0800 (PST)

> From: Akim Demaille <address@hidden>
> Date: 05 Apr 2002 10:53:54 +0200
> Back to your sentence: are you considering that Bison ought to choose
> the size of the output types?  Are you saying that for small grammars
> we should stick to shorts where possible, and int otherwise, instead
> of imposing int on everyone?

Yes, that was one of my points.  Bison should choose the narrowest
integer type that does the job.

> | #ifdef      eta10
> | # define    MAXSHORT        2147483647
> | # define    MINSHORT        -2147483648
> | #else
> | # define    MAXSHORT        32767
> | # define    MINSHORT        -32768
> | #endif

The ETA-10 is an old supercomputer (circa 1987 -- cooled by liquid
nitrogen!), evidently with 32-bit 'short'.  This appears to me to be
ancient code that predates C89.  The code should be reworked to use
<limits.h>'s SHRT_MIN and SHRT_MAX instead.  Something like this:

#include <limits.h>
#ifndef SHRT_MIN
# define SHRT_MIN (-32768)
#endif
#ifndef SHRT_MAX
# define SHRT_MAX 32767
#endif

and then replace all instances of MINSHORT and MAXSHORT elsewhere in
the code.

> | #if defined (MSDOS) && !defined (__GO32__)
> | # define MAXTABLE   16383
> | #else
> | # define MAXTABLE   32767
> | #endif

That is evidently to cater to machines with 16-bit word size.  I would
remove the MSDOS etc. condition.  The GNU coding standards say "don't
make any effort to cater to the possibility that an `int' will be less
than 32 bits.  We don't support 16-bit machines in GNU."

> I suppose we should use uintmax and Co., and AC_CHECK_SIZEOF.

I don't think we need to bother with uintmax_t.  The GNU coding
standards say (do I sound like a broken record? :-) "don't make any
effort to cater to the possibility that `long' will be smaller than
predefined types like `size_t'".  The items in question all have to
fit into memory, so "long" is wide enough, and will work even on
ancient K&R hosts that lack uintmax_t.  Also, 'long' is much easier to
printf than uintmax_t is.

> Yep, I saw your post, but I have not tried.  Are you sure it works?

I haven't tried it either, but if it doesn't work, Bison should be
fixed so that it does, since it's a POSIX requirement.  (I.e., it's
a code bug, not a doc bug.  :-)

> Wow!  Yes, I forgot there were tables using the tokens as entry.
> Grrr, people should not rely on this `feature'.

I wasn't so much worried about people using yytranslate in their own
code (they deserve what they get; it's private).  I was worried about
how 'bison' would output a yytranslate that will work on an EBCDIC
host, even when 'bison' itself is running on an ASCII host.  This
issue is independent of the pointer-versus-int issue.
