Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site rochester.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!think!harvard!seismo!rochester!sher
From: sher@rochester.UUCP (David Sher)
Newsgroups: net.lang
Subject: Re: Efficiency of Languages (and complexity)
Message-ID: <12890@rochester.UUCP>
Date: Tue, 5-Nov-85 08:58:40 EST
Article-I.D.: rocheste.12890
Posted: Tue Nov  5 08:58:40 1985
Date-Received: Thu, 7-Nov-85 06:30:18 EST
References: <15100004@ada-uts.UUCP> <15100007@ada-uts.UUCP> <189@opus.UUCP> <12860@rochester.UUCP> <196@opus.UUCP>
Reply-To: sher@rochester.UUCP (David Sher)
Organization: U. of Rochester, CS Dept.
Lines: 60
Keywords: O notation infinity hardware algorithms

In article <196@opus.UUCP> rcd@opus.UUCP (Dick Dunn) writes:
>Talking about sorting and complexity of algorithms...
>> In article <189@opus.UUCP> rcd@opus.UUCP (Dick Dunn) writes:
>>  ...
>> >NO!  You cannot throw "n different processors" at the array!  N is
>> >(potentially) larger than the number of processors you have...
>> ...
>> I am afraid that this opinion is an oversimplification...it is 
>> perfectly reasonable to study the case where the amount of hardware
>> available is proportional to size of the problem or some function thereof
>> (like log(n)).  This is no more an assumption of infinite hardware
>> than O(n) assumes that you have infinite time...
>
>OK, now I am SURE that the folks who are responding to this don't
>understand O() notation.  This notation is explicitly designed to express
>the ASYMPTOTIC COMPLEXITY of an algorithm.  If that's not what you mean,
>use some other notation.  The usefulness of this sort of complexity measure
>is that it tells how the required processing time (or space) increases in
>relation to the "size" of the problem being solved.  Again, O() notation is
>an asymptotic measure.  This makes it quite explicit that you cannot have
>an "...amount of hardware...proportional to size of the problem..." unless
>you consider an infinite amount of hardware.
>
>The "order-of" notation is not the only measure that can be applied to an
>algorithm--but it DOES have a specific meaning and a specific use, and it's
>a very basic part of analysis of algorithms.  Try to understand it before
>you use it.
>-- 
>Dick Dunn	{hao,ucbvax,allegra}!nbires!rcd		(303)444-5710 x3086
>   ...Never attribute to malice what can be adequately explained by stupidity.

Are you sure you understand O(f(n)) notation?  
The exact definition of O(f(n)) is:
"a function g(n) is said to be O(f(n)) if there exists a constant c
such that g(n) <= cf(n) for all but some finite (possibly empty) set of
nonnegative values for n."  (Taken w/o permission from Aho, Hopcroft,
and Ullman's "The Design and Analysis of Computer Algorithms".)  (Actually
I can define it axiomatically as well, but this gets the gist of it and
should be understandable even to the nonmathematicians in the audience.)
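To make that definition concrete, here is a small numeric sketch (my own
example; the function g(n) = 3n + 10 is not from AHU, just a hypothetical
illustration): g(n) = 3n + 10 is O(n), because with c = 4 we have
g(n) <= c*n for every nonnegative n except the finite set {0, ..., 9}.

```python
# Hypothetical example: show that g(n) = 3n + 10 is O(n) with c = 4.
# The definition permits a finite set of exceptional values of n;
# here that set is exactly {0, 1, ..., 9}, since 3n + 10 > 4n iff n < 10.
def g(n):
    return 3 * n + 10

c = 4
exceptions = [n for n in range(1000) if g(n) > c * n]
print(exceptions)  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] -- finite, as allowed
```

The finite exception set is the whole point: the bound need only hold
from some threshold onward, not everywhere.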
Actually, I suspect your misunderstanding is of the word "infinite".
O(f(n)) never refers to any of the infinities.  What it refers to is
an arbitrarily large number (which is a subtly different concept).
Thus I can consider an arbitrarily large amount of hardware without
considering an infinite amount.  This just means that if you are
willing to supply me with a problem n words long, I am willing to
construct some c times f(n) hardware (c must remain constant for the
rest of my life) and run the problem on it, for all but a finite set
of special cases.  In particular, there is a constant d such that if
you hand me a problem larger than d, I can definitely do it within
cf(n) hardware.  Since I don't expect you to hand me any infinitely
long problems, I do not expect to construct infinite amounts of
hardware.
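The argument above can be sketched in a few lines (my own hypothetical
numbers: c = 2 and f(n) = n stand in for whatever constant and growth
function a particular parallel algorithm requires):

```python
# For any *particular* problem of size n that you hand me, the hardware
# budget c*f(n) is a finite number.  It grows without bound as n grows,
# but no single finite problem ever demands an infinite amount.
def hardware_needed(n, c=2, f=lambda n: n):
    return c * f(n)

for n in [10, 1000, 10**6]:
    print(n, hardware_needed(n))  # finite in every case, however large n is
```

Arbitrarily large, but never infinite: that is exactly the distinction
between "hardware proportional to problem size" and "infinite hardware".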

-David Sher
"Never assume that someone else is stupid when your own stupidity is
an equally valid explanation"
-- 
-David Sher
sher@rochester
seismo!rochester!sher