Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Path: utzoo!watmath!clyde!ima!haddock!karl
From: karl@haddock.UUCP
Newsgroups: comp.lang.c
Subject: Re: bit-field pointers / arrays
Message-ID: <194@haddock.UUCP>
Date: Fri, 12-Dec-86 18:41:54 EST
Article-I.D.: haddock.194
Posted: Fri Dec 12 18:41:54 1986
Date-Received: Mon, 15-Dec-86 06:25:09 EST
References: <311@bms-at.UUCP>
Reply-To: karl@haddock.ISC.COM.UUCP (Karl Heuer)
Organization: Interactive Systems, Boston
Lines: 42

In article <311@bms-at.UUCP> stuart@bms-at.UUCP (Stuart D. Gathman) writes:
>It has been said that bit-field arrays are intrinsically impossible in 'C'
>because there can be no pointers to bit-fields.  [Suggests adding syntax for
>bit pointers and bit arrays.]
>
>The problem comes with 'sizeof'.  What is the size of a bit?

How about 0.125e+0 :-)

>[Options reordered for rebuttal --kwzh]
>3. A third approach is to define the size of a bitfield to be the minimum
>number of standard sizeof units required to store the field.

This would violate the principles that p + n is "really" p + n * sizeof(*p)
and that sizeof(type [N]) == N * sizeof(type).

>2. Another alternative is to make sizeof char == 8, but this would cause
>problems allocating huge arrays with malloc().

Only if the number of addressable bits is greater than LONG_MAX.  Since, on
such an architecture, you already need a super-long datatype to hold a bit
pointer, you might as well also have arithmetic types of that size, which
can then be typedef'd to "size_t" and "ptrdiff_t" (unsigned and signed).
(See X3J11 if you don't know what I'm talking about.)  Then the "only"
problem is dealing with the huge bulk of programs that assume
sizeof(char)==1.

>1. My first impulse is to disallow 'sizeof' applied to bitfields.  (It is
>not allowed now anyway.)

I think that this would be the best initial step.
It would be nice to phase out the assumption that sizeof(char)==1, but I
think that (for this version of the standard) ANSI had better insist on it,
though they may want to mark it as deprecated.  But before they can remove
this assumption, they have to go through all the functions that mention (or
suggest) "bytes" and decide whether they mean "char" or "quantum of sizeof".
For example, malloc() should probably expect a bit count (even though it
rounds upward for alignment), but what about fread()?  If it expects a bit
count (which is what one would expect, given that its arg type is "size_t"),
what happens if you try to fread() one bit?

Karl W. Z. Heuer (ima!haddock!karl or karl@haddock.isc.com), The Walking Lint