Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: $Revision: 1.6.2.16 $; site ada-uts.UUCP
Path: utzoo!linus!philabs!cmcl2!harvard!think!ada-uts!richw
From: richw@ada-uts.UUCP
Newsgroups: net.lang
Subject: Re: Re: Efficiency of Languages (and com
Message-ID: <15100015@ada-uts.UUCP>
Date: Wed, 6-Nov-85 12:36:00 EST
Article-I.D.: ada-uts.15100015
Posted: Wed Nov  6 12:36:00 1985
Date-Received: Sat, 9-Nov-85 06:31:57 EST
References: <189@opus.UUCP>
Lines: 78
Nf-ID: #R:opus:-18900:ada-uts:15100015:000:3819
Nf-From: ada-uts!richw    Nov  6 12:36:00 1985


>>> I believe that one can find the N-th highest number in an
>>> unsorted list in less than O(n log n) time (I'd have to check
>>> my algorithms books to make sure, but...)  Throw n different
>>> processors at an unsorted array, one looking for the first
>>> highest, the second looking for the second highest, etc.
>>> Have these processors work concurrently.  Voila!  (me)

> NO!  You cannot throw "n different processors" at the array!  N is
> (potentially) larger than the number of processors you have.
> (Dick Dunn)

As pointed out in another reply, it is perfectly reasonable to assume
that N processors will always be available.  Consider a machine with
limited memory (they ALL have limited memory) -- say M bytes.  It also
has M processors.  Come up with a problem instance that has more
elements to sort than processors -- you can't.  And let's not get into
the different subject of external sorts, please.  Of course no machine
has an unbounded number of processors.  HOWEVER, given an instance of
a sorting problem, a machine can exist which can 1) store the array
to sort and 2) match each array element with a processor.
For those machines, sorting can be done in O(n).  And, yes, a single
processor can find the k-th smallest element in an unsorted list of n
numbers in O(n).
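That single-processor selection claim is usually shown with quickselect,
which runs in expected O(n) time (the median-of-medians variant gets
worst-case O(n) but is longer).  A minimal sketch -- mine, not from any
algorithms text, in modern notation:

```python
import random

def quickselect(xs, k):
    """Return the k-th smallest element of xs (0-indexed), expected O(n) time.

    A random pivot partitions the list into three parts; only one part
    is recursed into, so the expected work forms a geometric series
    summing to O(n).  The k-th HIGHEST of n items is just the
    (n-1-k)-th smallest.
    """
    pivot = xs[random.randrange(len(xs))]
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    if k < len(less):
        return quickselect(less, k)
    if k < len(less) + len(equal):
        return pivot
    return quickselect(greater, k - len(less) - len(equal))
```

Note that no full sort happens: each recursive call discards one of the
three partitions, which is where the linear expected time comes from.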

In analyzing algorithms, no one cares that REAL machines have bounded
amounts of memory, because machines have enough memory that this
detail can be ignored.  Why concern yourself with the fact that no
machine can have an infinite number of processors?  This discussion is
relevant because it seems that, in the future, machines will have a
sufficiently large number of processors that one can, once again,
ignore the finiteness of the universe.  Details, details...  :-)
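For the curious: the O(n)-sort-with-n-processors claim above matches
odd-even transposition sort, where each array element gets a processor
and all compare-exchanges in a phase fire simultaneously, so n phases
means O(n) parallel time.  A sequential simulation (mine, so the inner
loop here costs O(n^2) total -- on the hypothetical machine each phase
is one parallel step):

```python
def odd_even_transposition_sort(xs):
    """Simulate odd-even transposition sort on a copy of xs.

    Phase p compares-and-swaps disjoint adjacent pairs: even phases
    handle pairs (0,1), (2,3), ...; odd phases handle (1,2), (3,4), ....
    With one processor per element, every pair in a phase is handled
    at once, and n phases are known to suffice for n elements.
    """
    xs = list(xs)
    n = len(xs)
    for phase in range(n):
        start = phase % 2
        for i in range(start, n - 1, 2):  # disjoint pairs: parallelizable
            if xs[i] > xs[i + 1]:
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
    return xs
```

The pairs within a phase never share an element, which is exactly what
lets the hypothetical machine execute a whole phase in constant time.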


> Actually,
> there is an assumption in analyzing algorithms that one does not have an
> infinite number of computational elements (whatever they may be)...
> (Dick Dunn)

This assumption is ONLY applicable when COMPARING algorithms.
If you want to analyze the performance of an algorithm on a new
machine you just built (or hypothesized), there's NO reason not to
assume that machine's architecture.  The "assumption" Dick mentions
applies, I think, to published analyses of algorithms; there, most
readers should assume the good ol' single-processor, random-access
machine for the sake of COMPARISON against other published results.

>>> Look, I'm no algorithms expert.  But separation of language
>>> issues from architectural issues from implementation issues
>>> is RARELY done and I'd like to see people be more careful
>>> about doing so.  (me)

> On the contrary, analysis of complexity of algorithms is generally done
> with considerable care to separate these issues and identify the
> assumptions about architecture implied by the algorithm.  If you're not
> accustomed to reading in the area, you may find yourself a little befuddled
> because you don't understand some of the assumptions commonly left
> implicit.
> (Dick Dunn)

Grrr...  My saying "I'm no algorithms expert" was a poor attempt at
humility.  Please avoid using such statements to imply lack of
experience.  My saying "so-and-so is rarely done" applied to
some of the off-the-cuff remarks I've seen posted.  Wasn't this
obvious?

I realize that algorithms research, language research, etc. are done
seriously (somewhere out there).  But, then again, this is not MIT's
Lab. for Computer Science (notice I didn't say the AI-Lab  :-) )
-- this is the net.  Informal discussions about various topics occur
all the time on the net and in real life.  Sometimes these discussions
include statements which I feel are wrong.  So I say so.  I truly welcome
being corrected when I'm wrong.  I'd like to learn how I'm mistaken.
But please no more bullshit about not being "accustomed to reading
in the area."

-- Rich Wagner