Path: utzoo!attcan!uunet!sco!md
From: md@sco.COM (Michael Davidson)
Newsgroups: comp.lang.c
Subject: Re: Memory Models
Keywords: Memory models,C
Message-ID: <888@fiasco.sco.COM>
Date: 15 Aug 89 01:15:20 GMT
References: <562@dcscg1.UUCP> <10703@smoke.BRL.MIL>
Reply-To: md@sco.COM (Michael Davidson)
Organization: The Santa Cruz Operation, Inc.
Lines: 18

In article <10703@smoke.BRL.MIL> gwyn@brl.arpa (Doug Gwyn) writes:
>In article <562@dcscg1.UUCP> drezac@dcscg1.UUCP (Duane L. Rezac) writes:
>>I am just getting into C and have a question on Memory Models.
>
>That is not a C language issue.  It's kludgery introduced specifically
>in the IBM PC environment.  Unless you have a strong reason not to,
>just always use the large memory model.  (A strong reason would be
>compatibility with an existing object library, for example.)
Sorry, but it is an evil necessity brought about by the segmented
architecture of the Intel 8086 and 80286.  Although the most common
place these processors show up is the IBM PC environment, the
kludgery follows them wherever they go.
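
For anyone who hasn't met these CPUs: in real mode a physical
address is formed by shifting a 16-bit segment value left four bits
and adding a 16-bit offset, so a single pointer can only span 64k
without a segment register being changed.  A rough sketch of the
arithmetic (illustrative only, not taken from any compiler's headers):

    /* 8086 real-mode address formation:
     * physical = segment * 16 + offset   (20 bits total)
     */
    unsigned long physical(unsigned segment, unsigned offset)
    {
        return ((unsigned long)segment << 4) + offset;
    }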

Actually, the better advice is to always use small model (i.e. up to
64k of code and 64k of data) unless you really don't care about
performance.  The cost of continually reloading segment registers
(which is what large model tends to force) is bad in real mode and
horrific in protected mode.  Just remember that small is beautiful....
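
To make the cost concrete, here is a hedged example using the
non-standard `far' keyword found in most 8086 C compilers (Microsoft
C, Turbo C and friends); the 0xB800 video segment is just a familiar
illustration:

    char buf[80];                          /* near data: reached through DS */
    char far *screen = (char far *) 0xB8000000L;  /* segment 0xB800, offset 0 */

    void copy_line(void)
    {
        int i;
        for (i = 0; i < 80; i++)
            screen[i * 2] = buf[i];        /* each far store may reload ES */
    }

In small model with an explicit far pointer you pay for the segment
reload only where you ask for it; in large model every pointer is far
and every dereference is a candidate for a reload.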