Path: utzoo!telly!ddsw1!mcdchg!rutgers!tut.cis.ohio-state.edu!gatech!bbn!bbn.com!rgoguen
From: rgoguen@bbn.com (Robert J Goguen)
Newsgroups: gnu.gcc.bug
Subject: bug in handling bit patterns
Keywords: Bit patterns
Message-ID: <32930@bbn.COM>
Date: 1 Dec 88 15:50:59 GMT
Sender: news@bbn.COM
Lines: 128

I'm building a port to the 3B2. The memory layout of my machine is as follows.



Increasing addresses ------------------------>>>>>>>>

3
1                                             7 6 5 4 3 2 1 0
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
              |               |              |              |
  HIGH WORD   |               |              |  LOW WORD    |
              |               |              |              |
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
m                                                           l
s                                                           s
b                                                           b

I have BITS_BIG_ENDIAN, BYTES_BIG_ENDIAN and WORDS_BIG_ENDIAN defined.

The most significant bit in a byte is the lowest numbered, the most significant
byte in a word is the lowest numbered, and the most significant word in a
multi-word value is the lowest numbered, so the defines above are correct.
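
For reference, the relevant lines in the tm.h are roughly the following (a
sketch only; the rest of the 3B2 machine description is left out, and I'm
assuming the usual convention that merely defining these macros selects
big-endian ordering):

/* Endianness of the 3B2 target: bits, bytes and words are all
   numbered from the most significant end.  */
#define BITS_BIG_ENDIAN
#define BYTES_BIG_ENDIAN
#define WORDS_BIG_ENDIAN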

	Gcc produces incorrect RTL when given the following source code:

struct foo {
     int twobit:2;
     int       :1;
     int threebit:3;
     int onebit:1;
   };

main()
{
   struct foo s3 ;

   s3.onebit = 1;
   printf("s3.onebit = %d\n",s3.onebit);
}
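
For what it's worth, here is roughly what the store should amount to by hand,
assuming fields are packed from the most significant bit of the 32-bit unit
as the layout above implies (this is only my reading of that layout, not
compiler output):

   /* Counting bit 0 as the msb: twobit takes bits 0-1, the unnamed
      field bit 2, threebit bits 3-5, and onebit bit 6.  Bit 6 of
      byte 0 is the 0x02 bit, so s3.onebit = 1 should end up ORing
      mask 2 into byte 0 of the struct.  */
   unsigned char *p = (unsigned char *) &s3;
   p[0] |= 0x02;		/* byte 0 of the struct, OR in 2 */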

Running gcc -dg -S tst.c on the above source produces a tst.c.greg dump; its
output is as follows:

;; Function main

;; Register dispositions: 16 in 0 

;; Hard regs used:  0 10 12

(note 1 0 2 "" -1)

(note 2 1 3 "" -2)

(insn 3 2 4 (set (mem/s:QI (plus:SI (reg:SI 10)
               (const_int 3)))
       (ior:QI (mem/s:QI (plus:SI (reg:SI 10)
                   (const_int 3)))
           (const_int 2))) 61 (nil)	<<<