Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.3 4.3bsd-beta 6/6/85; site gitpyr.UUCP
Path: utzoo!linus!gatech!gitpyr!robert
From: robert@gitpyr.UUCP (Robert Viduya)
Newsgroups: net.lang.c
Subject: Re: int16, int32
Message-ID: <650@gitpyr.UUCP>
Date: Tue, 13-Aug-85 02:39:18 EDT
Article-I.D.: gitpyr.650
Posted: Tue Aug 13 02:39:18 1985
Date-Received: Wed, 14-Aug-85 02:47:09 EDT
References: <541@brl-tgr.ARPA> <1167@umcp-cs.UUCP> <384@uwmcsd1.UUCP>
Organization: Georgia Tech, Atlanta
Lines: 33
Summary: wrong assumptions on basic type sizes.

In article <384@uwmcsd1.UUCP>, jgd@uwmcsd1.UUCP (John G Dobnick) writes:
>
> Um, excuse me folks, but this discussion is getting very machine dependent.
> Everyone seems to be assuming, at least tacitly, that "ints" come in
> only two sizes: 16 bits and 32 bits.
>
> I explicitly wish to point out that we run a UNIX implementation on a
> machine that uses the following:
>
>	short int:	18 bits
>	long int:	36 bits
>

I'd like to further point out that one of the machines I have access to
has the following:

	char:	 8 bits
	short:	32 bits
	int:	64 bits
	long:	64 bits

The machine (Control Data Cyber 180/855) doesn't support 16-bit arithmetic
very well, which is why shorts aren't 16 bits and ints aren't 32 bits.

				robert
--
Robert Viduya						01111000
Georgia Institute of Technology

UUCP:	{akgua,allegra,amd,hplabs,ihnp4,masscomp,ut-ngp}!gatech!gitpyr!robert
	{rlgvax,sb1,uf-cgrl,unmvax,ut-sally}!gatech!gitpyr!robert
BITNET:	CCOPRRV @ GITVM1
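
[Editor's note: not part of the original posting. Since the thread's point is
that you can't assume anything about basic type widths, a short sanity program
in the spirit of the discussion is sketched below; it assumes an ANSI-style
compiler with <limits.h> providing CHAR_BIT, which K&R compilers of the era
may lack.]

	#include <stdio.h>
	#include <limits.h>

	/* Print the actual width of each basic integer type on
	 * whatever machine this is compiled on, instead of assuming
	 * the 16-bit/32-bit split discussed in the thread. */
	int main(void)
	{
		printf("char:  %d bits\n", (int)(sizeof(char)  * CHAR_BIT));
		printf("short: %d bits\n", (int)(sizeof(short) * CHAR_BIT));
		printf("int:   %d bits\n", (int)(sizeof(int)   * CHAR_BIT));
		printf("long:  %d bits\n", (int)(sizeof(long)  * CHAR_BIT));
		return 0;
	}

The only ordering the language actually guarantees is
sizeof(short) <= sizeof(int) <= sizeof(long); the numbers printed above
are whatever the implementation chose (18/36 on the Univac, 32/64/64 on
the Cyber, and so on).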