Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 (Tek) 9/28/84 based on 9/17/84; site tektronix.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!genrad!decvax!tektronix!stever
From: stever@tektronix.UUCP (Steven D. Rogers)
Newsgroups: net.lang.f77
Subject: 64k limit
Message-ID: <5466@tektronix.UUCP>
Date: Tue, 2-Jul-85 13:58:32 EDT
Article-I.D.: tektroni.5466
Posted: Tue Jul  2 13:58:32 1985
Date-Received: Thu, 4-Jul-85 04:15:50 EDT
Organization: Tektronix, Beaverton OR
Lines: 20

MS-DOS FORTRAN 3.2 and above, Lahey F77, and probably also IBM
FORTRAN 2.0 and IBM Professional FORTRAN, handle larger memory
models for code and data.

I understand that by being clever one can use about 128K with
hardly any degradation of performance.

With regard to performance, some data derived from many
runs of a discrete simulation package suggest the
following rules of thumb:

AT performance is about 3x that of a PC
a math co-processor increases performance by a factor of 5
using memory larger than 64K for code or data cuts performance
   roughly in half
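
A rough sketch of how these rules of thumb might combine, assuming
(my assumption, not stated above) that the factors compose
independently by simple multiplication:

```python
def relative_speed(machine="PC", coprocessor=False, over_64k=False):
    """Estimated speed relative to a plain PC with no co-processor,
    using the rules of thumb from the simulation runs above."""
    speed = 1.0
    if machine == "AT":
        speed *= 3.0   # AT is about 3x a PC
    if coprocessor:
        speed *= 5.0   # math co-processor: factor of 5
    if over_64k:
        speed *= 0.5   # >64K code or data: performance halved
    return speed

# e.g. an AT with a co-processor running a >64K program would still
# come out about 7.5x faster than a plain PC:
print(relative_speed("AT", coprocessor=True, over_64k=True))  # 7.5
```

Whether the factors really multiply like this is a guess on my part;
measurements from real programs would be welcome.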


I will be glad to hear from anyone else whose opinions, rules of
thumb, or experience differ.