Path: utzoo!attcan!uunet!husc6!psuvax1!rutgers!iuvax!pur-ee!a.cs.uiuc.edu!m.cs.uiuc.edu!robison
From: robison@m.cs.uiuc.edu
Newsgroups: comp.lang.misc
Subject: Random permutation algorithm
Message-ID: <5200012@m.cs.uiuc.edu>
Date: 16 Aug 88 19:55:00 GMT
Lines: 28
Nf-ID: #N:m.cs.uiuc.edu:5200012:000:1133
Nf-From: m.cs.uiuc.edu!robison    Aug 16 14:55:00 1988


Though the following appears to be a math problem, I'm posting in comp.lang.misc
because the problem was stated in the context of imperative languages vs.
applicative languages.  [1] says that the fastest imperative algorithm takes
O(n) time, while the fastest applicative algorithm takes O(n log n) time.
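
For reference, the O(n) imperative algorithm presumably meant is the
Fisher-Yates (Knuth) shuffle.  A minimal C sketch follows; rand() is
just a stand-in random source, and note that it draws from a shrinking
range 0..i at each step, not n independent draws from 1..n as in the
problem statement below:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Fisher-Yates shuffle: fill a[] with 1..n, then swap each
     * position with a uniformly chosen position at or below it.
     * O(n) time.  rand() % (i+1) is only approximately uniform;
     * exact uniformity needs a rejection loop, omitted here. */
    void random_permutation(int a[], int n)
    {
        int i, j, t;
        for (i = 0; i < n; i++)
            a[i] = i + 1;               /* identity permutation 1..n */
        for (i = n - 1; i > 0; i--) {
            j = rand() % (i + 1);       /* assumed uniform on 0..i */
            t = a[i]; a[i] = a[j]; a[j] = t;
        }
    }

    int main(void)
    {
        int a[10], i;
        srand((unsigned) time(NULL));
        random_permutation(a, 10);
        for (i = 0; i < 10; i++)
            printf("%d ", a[i]);
        printf("\n");
        return 0;
    }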

    Problem: Random Permutation

    Description: Given n and a sequence of n integers drawn independently
                 from a uniform distribution over 1..n, produce a uniformly
                 random permutation of the sequence 1..n.

My question is: am I misreading the problem?  As stated it would seem
to be impossible.  There are n^n equally likely inputs and n! possible
outputs, so a uniform result would require each permutation to receive
exactly n^n / n! of the inputs; in general n! does not divide n^n
evenly (see the check below).  Can anyone clarify this, or put me in
contact with the authors?
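
A quick check (my own illustrative snippet, not from [1]) shows the
division already failing at n = 3, where there are 3^3 = 27 inputs but
only 3! = 6 outputs:

    #include <stdio.h>

    int main(void)
    {
        long n, i, fact, npow;
        for (n = 2; n <= 8; n++) {      /* small n; stays within long */
            for (fact = 1, i = 2; i <= n; i++)
                fact *= i;              /* n!  */
            for (npow = 1, i = 1; i <= n; i++)
                npow *= n;              /* n^n */
            printf("n=%ld: n^n mod n! = %ld\n", n, npow % fact);
        }
        return 0;
    }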

[1] Carl G. Ponder, Patrick C. McGeer, and Antony P-C. Ng,
    "Are applicative languages inefficient?", SIGPLAN Notices,
    vol. 23, no. 6, June 1988, pp. 135-139.


Arch D. Robison
University of Illinois at Urbana-Champaign
	
CSNET: robison@UIUC.CSNET
UUCP: {pur-ee,convex}!uiucdcs!robison
ARPA: robison@CS.UIUC.EDU (robison@UIUC.ARPA)