Path: utzoo!mnetor!uunet!seismo!sundc!pitstop!sun!decwrl!curie.dec.com!vantreeck
From: vantreeck@curie.dec.com
Newsgroups: comp.sys.mac
Subject: C and large data model
Message-ID: <8712152132.AA10832@decwrl.dec.com>
Date: 15 Dec 87 21:32:24 GMT
Organization: Digital Equipment Corporation
Lines: 76

>>Can anyone tell me the relative merits/demerits of Aztec C (or some other
>>inexpensive development environment) vs. LightSpeed on the Mac ][.
>>
>>LightSpeed is what everyone seems to recommend, but Aztec says it has a
>>symbolic debugger (any good? I'm used to codeview on a PC and dbx on UNIX)
>>make (does LSC have this?) and unlimited data size (this is important as
>>I hope to port a prolog compiler from UNIX eventually, does LSC let you
>>have huge memory spaces? - I want to have as much of my 5 megs for heap
>>as possible)

Funny you should mention "PROLOG" and the large data model. I wrote a PROLOG
compiler for the Mac using a mixture of C and assembler, and I used the large
data model. I used Manx's Aztec C compiler and assembler; it is THE ONLY
compiler available on the Mac that supports both the large code and large data
models. Alas, I can't share my PROLOG with anyone, because my employer
considers Apple a competitor, and any software on a competitor's hardware
makes it more viable competition, i.e., a conflict of interest (which
translates to: don't bother asking for it).
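
To make the large data model issue concrete, here is a minimal sketch (the
names and sizes below are purely illustrative, not from my compiler). In the
small data model, globals and statics are addressed as 16-bit offsets from
A5, so all of them together have to fit in roughly 32K; tables like these
blow past that limit and only link when the compiler emits full 32-bit data
references, which is what Aztec's large data model does:

    /* Illustrative only: static data far larger than the ~32K that
       16-bit A5-relative addressing allows in the small data model. */
    #include <stdio.h>

    #define HEAP_CELLS  65536L          /* 256K of 4-byte heap cells */
    #define TRAIL_CELLS 16384L          /* trail for backtracking    */

    static long heap[HEAP_CELLS];       /* alone already exceeds 32K */
    static long trail[TRAIL_CELLS];

    int main(void)
    {
        printf("heap:  %ld bytes\n", (long)sizeof(heap));
        printf("trail: %ld bytes\n", (long)sizeof(trail));
        return 0;
    }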
 
>	I've heard that the Aztec symbolic debugger doesn't exist, but
>I've never seen it, so I can't be more definite than that.

Aztec has had a VERY NICE symbolic debugger, db, for a long time. Aztec C
provides compiler and linker options to generate symbol tables that other
debuggers, e.g., MacsBug and TMON, can use. Manx is also developing a
source-line debugger, sdb.

>	LightspeedC has *more* than make. It's got full automatic project
>management, which means that you'll never have to write a makefile.

LightSpeed C also has *less* than Aztec's make if your complete system
contains modules written in other languages. If you want to rebuild C, Pascal,
and assembly language modules into one program, Aztec's make is the preferred
tool. Last time I checked, LightSpeed C required all source files in a project
to be in one directory. Aztec's make does not have that limitation; e.g., I
keep the sources for the built-in predicates in one directory, the unification
engine in another, and the database routines in yet another. For larger
development projects you will probably have sources in multiple directories
and complicated dependencies that an automated make, like LightSpeed's, can't
handle. In that case you will have no choice but to use a standard make
facility.

Note that if you're porting a PROLOG compiler based on the WAM (Warren's
Abstract Machine), you will probably write the engine in assembly language so
that you can do the global register allocation yourself (which means at least
a doubling of performance for PROLOG). You would also prefer to access data
directly, without having to index off the A5 register (using that register
instead for a globally allocated PROLOG "A/X" register), i.e., you'll probably
want Aztec C's large data model and the multi-language support of Aztec's
make.
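
As a minimal sketch of what I mean (again, the names are illustrative, not my
actual code): in a straight C port the WAM state lives in globals, so every
instruction is a memory reference, and in the small data model every one of
those references is indexed off A5. A hand-coded 68000 engine keeps H and the
argument registers in real machine registers instead, A5 included, which is
where the factor of two comes from.

    /* Illustrative WAM state in a C port: all of it is memory.      */
    typedef long Cell;                  /* tagged WAM cell            */

    static Cell  heap[65536L];          /* global stack (the "heap")  */
    static Cell *H = heap;              /* top-of-heap register       */
    static Cell  Areg[32];              /* argument registers A1..An  */

    /* Roughly a put_list step: in C, H and Areg[] are loads and
       stores; in assembly they would be 68000 registers.            */
    static void put_list(int i)
    {
        Areg[i] = (Cell)H;              /* tagging omitted            */
        H += 2;                         /* reserve car and cdr cells  */
    }

    int main(void)
    {
        put_list(1);
        return (int)(H - heap);
    }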

[Contrary to what some might think, I found that programs compiled for the
large data model produced slightly smaller object files and ran slightly
faster.]

With LightSpeed C you have to leave the environment to build non-C modules;
you don't have to leave the Aztec C shell to do so. LightSpeed C makes no
attempt to integrate non-Think products into its environment, while Manx makes
a good effort at providing an open interface that lets you integrate other
tools. As an example, clicking the "Edit" item in the menu brings up my QUED
editor instead of the default Aztec editor, z (which is like vi).

If you're more interested in quick prototyping than careful up-front
design (and you're only using C), then LightSpeed C will be particularly
appealing because of the quick turn-around time.

Aztec C and LightSpeed C are in the same ball park with respect to
compilation and link times, code size, and run-time speed. Neither is too hot
at code optimization, e.g., neither does common subexpression elimination,
loop jamming, dead code elimination, etc. I hear that Green Hills' C compiler,
which works with Apple's MPW, does an excellent job of optimization -- but it
supports neither the large data nor the large code model.
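
For example (purely illustrative code), without common subexpression
elimination the repeated index arithmetic below is computed twice at run
time; with either compiler you have to hoist it by hand, which is exactly
what a better optimizer would do for you:

    #include <stdio.h>

    struct term { long tag; long val; };

    /* i % n is a common subexpression: it gets computed twice here. */
    static long sum_plain(struct term *t, long i, long n)
    {
        return t[i % n].tag + t[i % n].val;
    }

    /* Hand-hoisted version: what CSE would produce automatically.   */
    static long sum_hoisted(struct term *t, long i, long n)
    {
        long k = i % n;
        return t[k].tag + t[k].val;
    }

    int main(void)
    {
        struct term t[4] = { {1, 2}, {3, 4}, {5, 6}, {7, 8} };
        printf("%ld %ld\n", sum_plain(t, 6, 4), sum_hoisted(t, 6, 4));
        return 0;
    }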
 
**The opinions stated herein are my own opinions and do not necessarily
represent the policies or opinions of my employer (Digital Equipment Corp.)
 
George Van Treeck
Digital Equipment Corporation (The home of industrial-strength software tools.)