Path: utzoo!attcan!uunet!auspex!guy
From: guy@auspex.UUCP (Guy Harris)
Newsgroups: comp.lang.c
Subject: Re: signed/unsigned char/short/int/long [was: #defines with parameters]
Message-ID: <529@auspex.UUCP>
Date: 28 Nov 88 17:31:59 GMT
References: <264@aber-cs.UUCP> <8982@smoke.BRL.MIL> <8983@smoke.BRL.MIL> <277@aber-cs.UUCP>
Reply-To: guy@auspex.UUCP (Guy Harris)
Distribution: eunet,world
Organization: Auspex Systems, Santa Clara
Lines: 59

>What I really was unhappy with in dpANS (among many other things...) is
>that signed has been introduced as a new keyword, and I ought to have
>said unsigned/signed, not unsigned/int.

The reason why it was introduced has nothing to do with X3J11's opinion
on whether "int" and "unsigned int" are similar types or not.  It was
introduced solely to provide a way for a programmer to specify that a
"char"-sized variable is to be signed; presumably, it applies to
"short", "int", and "long" purely for orthogonality.

>I would not agree that char has never been a length specifier for int;

The only way you can validly disagree with that statement is if you can
produce a compiler wherein "char int" is a valid type specifier.  I have
yet to see any such compiler.

>In Johnson's compiler instead one can fully use char,short,long as type
>modifiers, and int and unsigned as based types,

Funny, the SunOS C compiler is based on Johnson's PCC, and when I try to
compile

	foo()
	{
		char int x;
	}

it complains about an "illegal type modification".  That compiler, at
least, sure doesn't let you use "char" as a type modifier....

>and I regard this (in my never humble opinion) as the most natural
>thing to do, and one that would not have required the introduction of
>signed.

I don't regard it as the most natural thing to do.  I would have
preferred it had C made a distinction between:

	1) a type that was the unit of storage allocation;

	2) a type that was a "character", and that might require a
	   transfer function to convert between it and integral types
	   (big deal, the main loop of a character-string-to-decimal
	   routine might have included

		n = n*10 + (int(c) - int('0'));

	   instead of

		n = n*10 + (c - '0');

	   even if the language didn't specify an automatic
	   character-to-int conversion here, people could have lived
	   with it);

	3) a very small integral type.

For various reasons, including the crudely practical one of the
behavior of debuggers when confronted with "char", I tend to think of
characters and integral types as different sorts of beasts.
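
To make the "signed" point concrete, here's a quick sketch of what the
dpANS actually buys you, alongside the string-to-decimal idiom mentioned
above; the function name and the declarations are mine, not anything
taken from the draft:

	/*
	 * Sketch only.  "atod" is my name for a string-to-decimal
	 * routine; the loop uses the plain-"char" arithmetic C has
	 * always allowed, with the usual char-to-int promotion.
	 */
	int
	atod(const char *s)
	{
		int n = 0;

		while (*s >= '0' && *s <= '9')
			n = n*10 + (*s++ - '0');  /* "char" promotes to "int" */
		return (n);
	}

	/*
	 * The three "char" flavors under the dpANS: only the first
	 * two nail down the signedness; plain "char" leaves it to
	 * the implementation, which is why "signed" was added.
	 */
	signed char	sc;	/* can hold negative values on any machine */
	unsigned char	uc;	/* never negative */
	char		c;	/* signedness is implementation-defined */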