From: Julian Foad
Subject: Re: [bug-grep] length of dec. representation of a number
Date: Thu, 24 Feb 2005 22:57:50 +0000
User-agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8a6) Gecko/20050111
Paul Eggert wrote:
> How about if you use the standard gnulib macro for this instead of
> reinventing the wheel?  Here it is, taken from inttostr.h:
>
>   /* Upper bound on the string length of an integer converted to string.
>      302 / 1000 is ceil (log10 (2.0)).  Subtract 1 for the sign bit;
>      add 1 for integer division truncation; add 1 more for a minus sign. */
>   #define INT_STRLEN_BOUND(t) ((sizeof (t) * CHAR_BIT - 1) * 302 / 1000 + 2)
We can't #include inttostr.h because (AFAIK) it's not standard in ANSI/ISO C, but we can certainly copy the macro definition. That would be sensible: it avoids inventing a different name and a different definition for the same thing.
Thanks for explaining the relevance of CHAR_BIT: it gives the units of "sizeof (t)", so "sizeof (t) * CHAR_BIT" is the width of the type in bits. I had thought it could only refer to the size of the characters that hold the ASCII output of the conversion.
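For the record, here is a minimal standalone sketch (my own illustration, not anything from grep or gnulib) showing how the bound sizes a conversion buffer. The 302/1000 factor is log10(2) ~= 0.30103 rounded up to 0.302, which converts a bit count into a decimal-digit count; for a 32-bit int the bound works out to (32 - 1) * 302 / 1000 + 2 = 11, exactly the length of "-2147483648":

    #include <limits.h>  /* CHAR_BIT, INT_MIN */
    #include <stdio.h>   /* snprintf, printf */

    /* Copied from gnulib's inttostr.h, as quoted above.  */
    #define INT_STRLEN_BOUND(t) ((sizeof (t) * CHAR_BIT - 1) * 302 / 1000 + 2)

    int
    main (void)
    {
      /* The bound counts digits and sign only; add 1 for the trailing NUL.
         For a 32-bit int: (32 - 1) * 302 / 1000 + 2 = 9 + 2 = 11.  */
      char buf[INT_STRLEN_BOUND (int) + 1];

      /* INT_MIN has the longest decimal representation of any int,
         so it exercises the worst case the bound must cover.  */
      snprintf (buf, sizeof buf, "%d", INT_MIN);
      printf ("%s uses %d of %d bytes\n",
              buf, (int) sizeof buf - 1, (int) sizeof buf);
      return 0;
    }

Since sizeof is a compile-time constant, the bound can size a plain stack array with no malloc and no VLA, which is presumably why gnulib expresses it as a macro rather than a function.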
- Julian