From: Kaelin Colclasure
Subject: Re: How should ObjC encode 'long'?
Date: Tue, 27 Jan 2004 14:37:19 -0800
On Jan 26, 2004, at 7:09 PM, Ziemowit Laski wrote:
Currently, ObjC encodes 'long' as 'l' (and 'unsigned long' as 'L'), but only if sizeof(long) == sizeof(int). On LP64 targets, where sizeof(long) == 2 * sizeof(int), 'long' and 'unsigned long' get encoded as 'q' and 'Q', respectively, instead.

Personally, I tend to think that this is broken, and that 'long' should always be 'l', regardless of its size. However, I can also see an ABI argument (especially in the context of distributed objects) that would lead to the opposite conclusion. What do you all think?
I would assume the whole point of "encoding" is to externalize the relevant type information. And if it's for external consumption, it needs to be fully self-describing. 'q' and 'Q' unambiguously denote a signed / unsigned quadword... Seems like exactly the right thing.
-- Kaelin
Thanks,
--Zem

--------------------------------------------------------------
Ziemowit Laski                   1 Infinite Loop, MS 301-2K
Mac OS X Compiler Group          Cupertino, CA USA 95014-2083
Apple Computer, Inc.             +1.408.974.6229  Fax .5477

_______________________________________________
Discuss-gnustep mailing list
Discuss-gnustep@gnu.org
http://mail.gnu.org/mailman/listinfo/discuss-gnustep