Path: utzoo!attcan!uunet!peregrine!ccicpg!cci632!rit!tropix!moscom!ur-valhalla!uhura.cc.rochester.edu!sunybcs!rutgers!apple!bloom-beacon!tut.cis.ohio-state.edu!husc6!rice!uw-beaver!uw-june!uw-entropy!mica!charlie
From: charlie@mica.stat.washington.edu (Charlie Geyer)
Newsgroups: comp.lang.c
Subject: Re: What's a C expert?
Message-ID: <1528@uw-entropy.ms.washington.edu>
Date: 22 Jul 89 23:29:58 GMT
References: <12214@well.UUCP> <6057@microsoft.UUCP> <4722@alvin.mcnc.org> <25999@amdcad.AMD.COM> <4724@alvin.mcnc.org>
Sender: news@uw-entropy.ms.washington.edu
Reply-To: charlie@mica.stat.washington.edu (Charlie Geyer)
Distribution: all
Organization: UW Statistics, Seattle
Lines: 22


In article <25999@amdcad.AMD.COM> tim@amd.com (Tim Olson) writes:

> Having the sign of chars be undefined allows the implementation to be as
> efficient as possible with respect to converting between chars and ints.

In article <4724@alvin.mcnc.org> spl@mcnc.org.UUCP (Steve Lamont) replies:

> Huh?  Are you telling us that the standard *allows* such a horrible
> thing?  Aaaaaaarrrrrgh!  :-+ (<-- smiley sucking on a persimmon)  I
> thought the standard was supposed to clarify things, not confuse the
> issue.  It's almost like saying that a declaration of int may be either
> signed or unsigned.  Makes for somewhat unpredictable behavior and/or
> some fairly verbose defensive coding...

C is "hardware friendly."

The standard has been written by some very clever people to allow C to
be implemented as efficiently as possible on almost any hardware, no
matter how brain-damaged.

It's not a bug, it's a feature.