Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.1 6/24/83; site mit-eddie.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!smh
From: smh@mit-eddie.UUCP (Steven M. Haflich)
Newsgroups: net.lang.c
Subject: Re: $ in identifiers -- poll
Message-ID: <3281@mit-eddie.UUCP>
Date: Sun, 9-Dec-84 08:15:10 EST
Article-I.D.: mit-eddi.3281
Posted: Sun Dec 9 08:15:10 1984
Date-Received: Mon, 10-Dec-84 03:22:45 EST
References: <3@aeolus.UUCP> <260@sftri.UUCP>
Reply-To: smh@mit-eddie.UUCP (Steven M. Haflich)
Organization: MIT, Cambridge, MA
Lines: 22

Another very ugly but very practical reason for not allowing additional alphameric characters in identifiers is portability.  Regardless of what the C Standard eventually says, not all machine/OS combinations support all C-ASCII characters in identifiers (especially externals), and some support non-C-ASCII characters.  There is little a standards committee or net.lang.c can do about this [except, I suppose, flame :-)].

My intuition is that languages and OS/linkers most commonly allow exactly *one* legal nonalphameric character in identifiers, and this character is most often overloaded as an informal package flag: `_' as an external prefix char in Unix/C, `$' in lots of Big Blue systems, etc.  When porting code either direction, the simple one-to-one mapping of these characters saves a lot of grief.  Let's not make it tougher to *ex*port Unix/C to other systems by trying to make it very occasionally easier to *im*port foreign code.

It is my opinion, by the way, that the traditional availability of these informal package-flag chars inside identifiers was a portability botch, mostly impeding exporting code, not importing.  But it is only recently that vendors and their captive language designers have come to realize that exportability of code can sell machines just as well as importability.