Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Path: utzoo!utgpu!water!watmath!orchid!rbutterworth
From: rbutterworth@orchid.UUCP
Newsgroups: comp.lang.c
Subject: Re: Writing readable code (what is NULL)
Message-ID: <9695@orchid.UUCP>
Date: Sun, 12-Jul-87 21:43:42 EDT
Article-I.D.: orchid.9695
Posted: Sun Jul 12 21:43:42 1987
Date-Received: Mon, 13-Jul-87 04:26:53 EDT
References: <8249@brl-adm.ARPA> <17466@amdcad.AMD.COM>
Organization: U of Waterloo, Ontario
Lines: 50

In article <17466@amdcad.AMD.COM>, tim@amdcad.AMD.COM (Tim Olson) writes:
> In article <8249@brl-adm.ARPA> Leisner.Henr@Xerox.COM (marty) writes:
> >#define NULL	(char *) 0
> AARRRGH!!!  *PLEASE* people -- we've been over this time after time:
> 1)	#define NULL 0

This has got to be the most discussed topic in this newsgroup.
Every few months it comes up again.  Dozens of articles state one
thing, dozens of others say something else, and eventually it always
boils down to the same fact: "#define NULL 0" is always correct,
"#define NULL ((void*)0)" is (unfortunately) acceptable on some
compilers, and everything else is wrong.
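
For instance, here is a quick sketch of why a definition like
"(char *) 0" gets people into trouble (the macro names are mine,
purely for illustration):

#define GOOD_NULL 0
#define BAD_NULL  ((char *) 0)

int main(void)
{
    int *ip;

    ip = GOOD_NULL;       /* fine: the constant 0 converts to any pointer  */
    /* ip = BAD_NULL; */  /* not fine: assigning a char * to an int *      */
                          /*   draws a complaint from the compiler         */
    return ip != GOOD_NULL;
}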

Considering that this is an open-and-shut case of a simple fact in the
language, isn't it amazing how much it is discussed and argued about?

Clearly there is something wrong with the concept of NULL, or there
wouldn't be such confusion.

NULL was introduced as an attempt to make the use of the literal 0
a little more obvious.  In "p=NULL;" versus "p=0;", for example, the
use of NULL indicates that a null pointer is intended, and so it
reminds the reader that "p" is a pointer of some kind and not an
integer.  But note that by "indicates" I mean that it indicates it
to a human being reading the code, not to the compiler.  It is simply
a documentation device; it has no special meaning in the language,
and the compiler treats it exactly the same as the literal 0.
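
A tiny example of what I mean (the variable names are mine, just
for illustration):

#define NULL 0          /* the "always correct" definition from above */

int main(void)
{
    char *p;
    int   i;

    p = NULL;           /* compiled exactly as "p = 0;"; the literal 0   */
    p = 0;              /*   becomes a null pointer either way           */

    i = NULL;           /* perfectly legal, but misleading: NULL hints   */
                        /*   "pointer" to a reader, yet the compiler     */
                        /*   just sees a plain 0                         */
    return i;
}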

To me, all this confusion indicates that perhaps it was a mistake
to have ever defined NULL in the first place.  Surely the trouble it
has caused for all the people who have been fooled by this simple
device far outweighs the usefulness that was originally intended.

If I were god and had written K&R, I think I'd either have built NULL
into the language, or have used the following definition of NULL:
#define NULL(type) ((type)0)
In the former case "p=NULL" would work fine, but "func(NULL)" would
produce an error since there is no type for it to be coerced to.
In the latter case, the user would be forced to code "p=NULL(char*);"
or "func(NULL(int*));".

Either way there wouldn't be any of this confusion that we have now,
and will probably always have.  (Actually with ANSI officially supporting
"(void*)0", things will probably be even more confusing than they ever
were.)

I for one will never use NULL in anything I write.  I'm sure no one
could have foreseen this when it was first introduced, but in retrospect
it really is a half-assed solution that has caused far more problems
than it has solved.