Path: utzoo!utgpu!water!watmath!clyde!att!osu-cis!tut.cis.ohio-state.edu!mandrill!gatech!uflorida!umd5!mimsy!chris
From: chris@mimsy.UUCP (Chris Torek)
Newsgroups: comp.lang.c
Subject: Re: Let's define our own NULL
Message-ID: <12152@mimsy.UUCP>
Date: 26 Jun 88 15:44:14 GMT
References: <160@navtech.uucp> <11326@steinmetz.ge.com>
Organization: U of Maryland, Dept. of Computer Science, Coll. Pk., MD 20742
Lines: 36

>| 		#define NULL	0

In article <11326@steinmetz.ge.com> davidsen@steinmetz.ge.com
(William E. Davidsen Jr) writes:

>  I know I'll get flamed for disagreeing with K&R, but this is WRONG.
>The original book was written before segmented architectures were
>commonly used, and the idea of "near" and "far" pointers was not an
>issue. When defining NULL as zero, you open the possibility of errors in
>argument lists terminating with NULL, since *only* assignment will
>translate zero to a pointer.

A cast converts exactly as an assignment does; only incorrect programs
(or certain prototype-dependent routines, under dpANSish compilers)
fail to cast NULL in argument lists.
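
For example (sketching with the Unix execl() interface, whose
variable-length argument list must end with a null character pointer):

	execl("/bin/echo", "echo", "hi", (char *)0);	/* right: cast gives
							   a char *'s worth
							   of bytes */
	execl("/bin/echo", "echo", "hi", 0);		/* wrong wherever int
							   and char * differ
							   in size */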

>  Better is to define NULL:
>	#define NULL	((void *) 0)
>and be sure it works. Some compilers have all sorts of tests to use zero
>or zero long, but this always leaves room for a problem in mixed mode
>programming.

Unfortunately, this still does not work without a cast.  If functions
f and g take code-space and data-space pointers respectively, then in
a large-data-space small-code-space (`compact model') compiler,

	f(NULL);
	g(NULL);

where NULL is ((void *)0), will pass the wrong number of bytes to at
least one of the two functions.
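
With explicit casts (assuming, say, that f takes a pointer to function
and g a pointer to char), each call pushes a pointer of the right size:

	f((int (*)())NULL);	/* code-space (function) pointer */
	g((char *)NULL);	/* data-space (object) pointer */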

The only rule needed for correct code is this:  Casts always work.
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7163)
Domain:	chris@mimsy.umd.edu	Path:	uunet!mimsy!chris