Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Path: utzoo!mnetor!seismo!ll-xn!ames!ucla-cs!pismo!marc
From: marc@pismo.cs.ucla.edu (Marc Kriguer)
Newsgroups: comp.lang.c
Subject: Re: NULL, zero, and readable code
Message-ID: <7040@shemp.UCLA.EDU>
Date: Mon, 6-Jul-87 05:17:33 EDT
Article-I.D.: shemp.7040
Posted: Mon Jul  6 05:17:33 1987
Date-Received: Tue, 7-Jul-87 01:22:24 EDT
References: <8170@brl-adm.ARPA>
Sender: root@CS.UCLA.EDU
Reply-To: marc@pismo (Marc Kriguer)
Organization: UCLA Computer Science Department
Lines: 23

In article <8170@brl-adm.ARPA> bdm-thad@Walker-EMH.arpa writes:
>
>Re: NULL vs. zero and readable code
> 
>I think the problem here is the definition of NULL.  NULL is not, repeat,
>NOT, equal to zero, at least the base ten zero.

0 in base 10 is just zero, and zero has the same value in ANY base;
the radix of the notation doesn't matter.
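
In C terms, the same point (a trivial sketch): the decimal, octal, and
hex spellings of zero all denote the one value zero.

	#include <stdio.h>

	main()
	{
		if (0 == 00 && 0 == 0x0)	/* decimal, octal, hex: always true */
			printf("zero is zero in any base\n");
		return 0;
	}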

>ASCII in fact defines them differently:  NULL is hex 0 while zero is hex
>30. Therefore, stdio.h should define NULL as 0x0, not 0 which would be
>0x30.

No.  ASCII defines the character code for the CHARACTER '0' to be 0x30,
but that does NOT mean that zero is hex 30; only the character is.  When you
	#define NULL 0
you get the integer 0, not the character '0'.
Thus NULL is being defined as 0  [or (char *) 0, if you prefer], not as 0x30.
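
Here's a short test program to make the distinction concrete (just a
sketch; it assumes an ASCII machine, and that NULL comes from <stdio.h>):

	#include <stdio.h>

	main()
	{
		char *p = NULL;			/* same as  char *p = 0;  */

		printf("'0' = 0x%x\n", '0');	/* the CHARACTER: 0x30 in ASCII */
		printf(" 0  = 0x%x\n", 0);	/* the VALUE: 0x0 */
		if (p == 0)			/* always true */
			printf("NULL compares equal to 0\n");
		return 0;
	}
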
 _  _  _                        Marc Kriguer
/ \/ \/ \
  /  /  / __     ___   __       BITNET: REMARCK@UCLASSCF
 /  /  / /  \   /  /  /         ARPA:   marc@pic.ucla.edu
/  /  /  \__/\_/   \_/\__/      UUCP:   {backbones}!ucla-cs!pic.ucla.edu!marc