Path: utzoo!utgpu!water!watmath!clyde!bellcore!rutgers!mailrus!ames!umd5!brl-adm!adm!dsill@nswc-oas.arpa
From: dsill@nswc-oas.arpa (Dave Sill)
Newsgroups: comp.lang.c
Subject: Re: volatile
Message-ID: <15422@brl-adm.ARPA>
Date: 31 May 88 14:04:34 GMT
Sender: news@brl-adm.ARPA
Lines: 38

From: Steven Ryan 
>Enormously expensive refers to quadratic, cubic, quartic, or higher-order
>time.  Best possible optimisation uses exponential time.  (There are a few
>transitive closures involved, graph colouring, et cetera.)
>
>What is involved is spending a few days to compile a large system.
>
>Some of us have customers who notice if compilation time increases by a
>millisecond.

What takes hours today will probably only take minutes tomorrow.
Anyway, once development is complete, compilation time is secondary to
run time.

>Also, please do not assume the Church-Turing Hypothesis.

Huh?  I think it's quite safe to assume the Church-Turing *Thesis*.
And what does it have to do with optimization, anyway?

>The point being, do not expect magic from a compiler. It can provide at best
>linear improvement.

Since when?  (And what's wrong with linear improvement?)  What about a
compiler that recognizes certain polynomial-time algorithms that can be
rewritten to run in linear time?  Or a compiler that recognizes the
Sieve of Eratosthenes in its standard benchmark form and simply outputs
the correct count?  Or a compiler that checks whether a program has no
input (so its output is the same every time it's run), determines what
it does by running it once, and produces code that simply generates the
correct output?  Sure, compile times could be humongous when such
optimization is enabled, but run times would be next to nil.
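
To make that last example concrete, here is a sketch in C, using the
sieve in its familiar benchmark form.  The program takes no input, so
its output is identical on every run; a compiler doing whole-program
constant folding could execute it once at compile time and replace all
the loops with a single printf of the answer.  (The SIZE of 8190 and
the output format are just the usual benchmark conventions, assumed
here for illustration; the transformation doesn't depend on them.)

    #include <stdio.h>

    #define SIZE 8190

    int main(void)
    {
        /* No input anywhere: the count computed below is a
           compile-time constant in disguise. */
        static char flags[SIZE + 1];
        int i, k, count = 0;

        for (i = 0; i <= SIZE; i++)
            flags[i] = 1;

        for (i = 0; i <= SIZE; i++) {
            if (flags[i]) {
                int prime = i + i + 3;          /* 3, 5, 7, ... */
                for (k = i + prime; k <= SIZE; k += prime)
                    flags[k] = 0;               /* knock out multiples */
                count++;
            }
        }

        printf("%d primes\n", count);

        /* A sufficiently clever optimizer, having run the loops once
           during compilation, could emit the equivalent of:

               printf("%d primes\n", 1899);

           and nothing an observer of the program's output could see
           would change. */
        return 0;
    }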

=========
The opinions expressed above are mine.

"We must remove the TV-induced stupor that lies like a fog across the
 land."
					-- Ted Nelson