Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: notesfiles
Path: utzoo!linus!decvax!tektronix!hplabs!hp-pcd!orstcs!richardt
From: richardt@orstcs.UUCP (richardt)
Newsgroups: net.arch
Subject: RE: RISC (Actually 68K densities)
Message-ID: <12200008@orstcs.UUCP>
Date: Wed, 10-Jul-85 01:48:00 EDT
Article-I.D.: orstcs.12200008
Posted: Wed Jul 10 01:48:00 1985
Date-Received: Mon, 15-Jul-85 01:43:17 EDT
Organization: Oregon State University - Corvallis, OR
Lines: 24
Nf-ID: #N:orstcs:12200008:000:1351
Nf-From: orstcs!richardt    Jul  9 21:48:00 1985


As I read, I see a *lot* of comments about how "most 68xxx & 32xxx 
instructions take 32 bits anyway!"  I would like to suggest that a large
factor in this may be due to *sloppy compilers*!  I have written
68000 code as a hobbyist for about the last three years.  I have started
writing BASIC interpreters several times and a FORTH-relative once.  In
all of those cases, I found that the average instruction width worked out to
  # of bits of code / # of instructions = ~18 bits
That's a rough estimate; it may be a little high.  When you add in data,
then things get interesting.  A program that uses a lot of predetermined
data runs up that average a lot more than you'd expect.  When I sat down
and started to write op system routines, the average instruction was
about 20 bits.  (Note: I am using a mathematical average here.)
My point is, if most of your programs have larger average instruction
widths, maybe you've got a sloppily-written compiler.  The average 
instruction width on a 'thousand needn't be anywhere near 32 bits.
	So look at your compiler before you go running off screaming 
	'Give me a RISC.'
I'll step down off my soapbox now, and someone else can defend the National
chips.  I don't have the info (yet).
----------------------------------------
Is there an assembly-language programmer in the house?
						orstcs!richardt
/* ---------- */