Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.1 6/24/83; site fortune.UUCP
Path: utzoo!watmath!clyde!burl!mgnetp!ihnp4!fortune!rpw3
From: rpw3@fortune.UUCP
Newsgroups: net.rumor
Subject: Re: A Quick Question - (nf)
Message-ID: <3663@fortune.UUCP>
Date: Fri, 22-Jun-84 04:08:48 EDT
Article-I.D.: fortune.3663
Posted: Fri Jun 22 04:08:48 1984
Date-Received: Sat, 23-Jun-84 02:43:03 EDT
Sender: notes@fortune.UUCP
Organization: Fortune Systems, Redwood City, CA
Lines: 106

#R:isrnix:-18600:fortune:9700009:000:5219
fortune!rpw3    Jun 21 23:40:00 1984

Summary:

The human brain stores ~1000 gigabytes?? Come on! Humans max out well below
80 bits/sec, so no more than ~10-30 Gbyte is needed... (so maybe we got
some spares, huh?)

Discussion:

Actually, quite a bit of work has been done on this by quite a few
experimental psychologists.  The classic paper on human "bandwidth"
is, of course,

	G. A. Miller, "The Magical Number Seven, Plus or Minus Two:
	Some Limits on Our Capacity for Processing Information",
	The Psychological Review, vol. 63, no. 2, pp. 81-97, 1956

Miller defines "processing capacity" in terms of "absolute judgments",
i.e., the ability to discriminate among stimuli (e.g., "which of the N
tones is this tone?"). The information per trial (e.g., correctly picking
one of eight tones is 3 bits) is adjusted for error rate:

	"...the observer is considered to be a communications
	channel... The experimental problem is to increase the
	amount of input information and to measure the amount
	of transmitted information. If the observer's absolute
	judgements are quite accurate, then nearly all the input
	information will be transmitted and will be recoverable
	from his responses. If he makes errors, the transmitted
	information may be considerably less than the input. We
	expect that, as we increase the amount of input information,
	the observer will begin to make more and more errors; we
	can test the limits of accuracy of his absolute judgements.
	If the human observer is a reasonable kind of communication
	system, then when we increase the amount of input information
	the transmitted information will increase at first and will
	eventually level off at some asymptotic value. This asymptotic
	value we take to be the 'channel capacity' of the observer;
	it represents the greatest amount of information that he can
	give us about the stimulus on the basis of absolute judgements..."
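
If you want to see how "transmitted information" actually gets computed
from one of these absolute-judgment experiments, here is a little C sketch.
The 4x4 stimulus/response confusion matrix is made up purely for
illustration; only the formula (Shannon's mutual information) is the real
thing.

/*
 * Transmitted information (bits per judgment) from a stimulus/response
 * confusion matrix, the way Miller's "channel capacity" is figured.
 * The 4x4 counts below are invented; only the formula is real:
 *     I(S;R) = sum over s,r of p(s,r) * log2( p(s,r) / (p(s)*p(r)) )
 */
#include <stdio.h>
#include <math.h>

#define NSTIM 4

int main(void)
{
    /* count[s][r] = how often stimulus s drew response r (fake data) */
    static double count[NSTIM][NSTIM] = {
        { 20,  5,  0,  0 },
        {  4, 18,  3,  0 },
        {  0,  4, 17,  4 },
        {  0,  0,  5, 20 },
    };
    double total = 0.0, info = 0.0, p;
    double ps[NSTIM] = { 0 }, pr[NSTIM] = { 0 };
    int s, r;

    for (s = 0; s < NSTIM; s++)
        for (r = 0; r < NSTIM; r++)
            total += count[s][r];

    for (s = 0; s < NSTIM; s++)
        for (r = 0; r < NSTIM; r++) {
            ps[s] += count[s][r] / total;   /* marginal p(stimulus) */
            pr[r] += count[s][r] / total;   /* marginal p(response) */
        }

    for (s = 0; s < NSTIM; s++)
        for (r = 0; r < NSTIM; r++) {
            p = count[s][r] / total;
            if (p > 0.0)
                info += p * log(p / (ps[s] * pr[r])) / log(2.0);
        }

    printf("input information: %.2f bits (%d equally likely stimuli)\n",
           log((double) NSTIM) / log(2.0), NSTIM);
    printf("transmitted:       %.2f bits per judgment\n", info);
    return 0;
}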

Plotting many previous experimenters' data (plus some of his own), he
shows that human channel capacity for uni-dimensional stimuli (pitch
only, or brightness only, or linear position only) is about 2.6 bits,
i.e., correctly picking one of six equally likely choices. The highest
capacity observed was about 3.5 bits (10-15 choices), for judging
pointer positions along a line. [Hmmm... like interpolating between
gradations on a meter stick.] The lowest capacity was for taste
intensities, about 1.9 bits.
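
For reference, the bits-to-alternatives conversion is just a power of
two; in C:

/* k bits of capacity ~ 2^k equally likely alternatives told apart */
#include <stdio.h>
#include <math.h>

int main(void)
{
    printf("2.6 bits ~ %4.1f alternatives\n", pow(2.0, 2.6)); /* ~6  */
    printf("3.5 bits ~ %4.1f alternatives\n", pow(2.0, 3.5)); /* ~11 */
    printf("1.9 bits ~ %4.1f alternatives\n", pow(2.0, 1.9)); /* ~4  */
    return 0;
}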

With multidimensional stimuli (e.g., pitch AND loudness AND duration, etc.),
channel capacity goes up, but even with 6-8 dimensions the information
per decision was not more than about 7 bits. By grouping items into
sequences, the total information increases, although the information
per item goes down, so that short-term memory recalls of 40 bits or so
were demonstrated (with the aid of considerable re-coding).

I am skipping over the really important contribution of the paper:
the ability of humans to "re-code" or "chunk" their input so as to handle
more data. (In the memory test above, the "re-coding" was to use octal,
hex, and base-32 numbers to remember strings of binary digits.)
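
Here is a toy C illustration of that re-coding trick (the bit string is
invented): the same 20 bits is twenty items to hold onto as raw binary
digits, but only four items once they are grouped five at a time into
base-32 symbols, comfortably inside the seven-plus-or-minus-two span.

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *bits = "10110011100011110101";             /* 20 binary digits  */
    const char *sym  = "0123456789ABCDEFGHIJKLMNOPQRSTUV"; /* base-32 "digits"  */
    int i, j, v, n;

    n = (int) strlen(bits);
    printf("raw:     %s   (%d items to remember)\n", bits, n);

    printf("chunked: ");
    for (i = 0; i + 5 <= n; i += 5) {       /* 5 bits -> 1 base-32 symbol */
        for (v = 0, j = 0; j < 5; j++)
            v = v * 2 + (bits[i + j] - '0');
        putchar(sym[v]);
    }
    printf("   (%d items to remember)\n", n / 5);
    return 0;
}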

[Note: this paper has been used as a standard reference to show why, for
example, function keys on a keyboard should be clustered in groups of
four or five.]

Instead, look at what this says about total human bandwidth. As an UPPER
limit, let us assume that we can correctly and consistently and continually
absorb and process input stimuli at 8 bits per event (higher than ANY shown
in the lab!) at 10 events per second (faster than one event per reaction-time).
That would put our input processing at 80 bits/second (which is FAR too high!).

Note that this has little to do with reading speed, since estimates of
the information content of English range as low as 1.1 bits/word, once
the contextual environment is built up. (Try cutting every third word
out of newspaper stories... you'll be surprised how much is left!)
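
If you want to try that without scissors, a throwaway C filter does it;
pipe any news story through it and see how much sense the output still
makes (compile and run it as something like "cc cut3.c && a.out < story"):

/* Drop every third word of whatever comes in on stdin. */
#include <stdio.h>
#include <ctype.h>

int main(void)
{
    int c, in_word = 0, nword = 0;

    while ((c = getchar()) != EOF) {
        if (isspace(c)) {
            in_word = 0;
            putchar(c);
        } else {
            if (!in_word) {          /* first character of a new word */
                in_word = 1;
                nword++;
            }
            if (nword % 3 != 0)      /* keep words 1 and 2, drop word 3, ... */
                putchar(c);
        }
    }
    return 0;
}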

Also, due to re-coding, we are constantly editing our input to maximize
the "quality" of those bits. (See Frank Herbert's "Destination Void" for
a fascinating discussion of consciousness as mediator of perception. Watch
your own mind sometime to see how things in the environment come into your
awareness and disappear again, all the time.)

Again, 80 b/s is a somewhat excessive upper limit. Try reading and
REMEMBERING 10 char/sec of random text, continuously! Even so, at 24 hours
a day (no sleep?), 100 years per life, remembering everything perfectly,
one needs only about 30 gigabytes of long-term memory. Fits on a couple of
Betamax cassettes, easy! [Note: 2 bits/Hz, 75% utilization of each scan line,
a good Reed-Solomon code on top of rate-1/2 Viterbi, gives over 1.5 Gbyte
per hour of play time ==> ~3.5 six-hour tapes.]
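
For the skeptics, here is the back-of-the-envelope arithmetic as a C
program; the 1.5 Gbyte/hour figure is just the tape estimate from the
note above, not gospel.

#include <stdio.h>

int main(void)
{
    double bits_per_sec = 8.0 * 10.0;   /* 8 bits/event * 10 events/sec    */
    double seconds      = 100.0 * 365.25 * 24.0 * 3600.0;  /* 100 years    */
    double gbytes       = bits_per_sec * seconds / 8.0 / 1.0e9;
    double gb_per_tape  = 1.5 * 6.0;    /* ~1.5 Gbyte/hr on a 6-hour tape  */

    printf("lifetime input: about %.0f Gbytes\n", gbytes);
    printf("which is about %.1f six-hour Beta tapes\n", gbytes / gb_per_tape);
    return 0;
}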

(Actually, this makes the science-fiction ideas about personality/learning
transfer seem almost attainable, if only...)

In fact, the actual data rate and storage are probably far less. I would
dare say less than ONE Beta tape! The trick is in coding ("chunking") the
data. Anybody want to try and Huffman-code a lifetime?
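
Not entirely in jest; here is a bare-bones Huffman coder in C, with
invented frequencies for a half-dozen kinds of "events", just to show how
the average bits-per-item drops once the common stuff gets the short
codes. Gathering real lifetime statistics is left as an exercise.

#include <stdio.h>
#include <math.h>

#define NSYM 6

struct node { double freq; int left, right; };   /* left/right = -1: leaf */

int main(void)
{
    /* invented frequencies for six kinds of "events" in a day's input */
    static double freq[NSYM] = { 0.40, 0.25, 0.15, 0.10, 0.06, 0.04 };
    struct node pool[2 * NSYM];      /* leaves plus internal nodes       */
    int alive[2 * NSYM];             /* indices of not-yet-merged nodes  */
    int depth[2 * NSYM];             /* code length = depth in the tree  */
    int stack[2 * NSYM];
    int nalive = NSYM, npool = NSYM, sp = 0;
    int i, a, b, t, n;
    double avg = 0.0;

    for (i = 0; i < NSYM; i++) {
        pool[i].freq = freq[i];
        pool[i].left = pool[i].right = -1;
        alive[i] = i;
    }

    /* repeatedly merge the two least-frequent surviving nodes */
    while (nalive > 1) {
        a = 0; b = 1;
        if (pool[alive[a]].freq > pool[alive[b]].freq) { t = a; a = b; b = t; }
        for (i = 2; i < nalive; i++) {
            if (pool[alive[i]].freq < pool[alive[a]].freq)      { b = a; a = i; }
            else if (pool[alive[i]].freq < pool[alive[b]].freq) { b = i; }
        }
        pool[npool].freq  = pool[alive[a]].freq + pool[alive[b]].freq;
        pool[npool].left  = alive[a];
        pool[npool].right = alive[b];
        alive[a] = npool++;          /* parent replaces one child...     */
        alive[b] = alive[--nalive];  /* ...and the other is dropped      */
    }

    /* walk the tree from the root, recording each leaf's depth */
    depth[alive[0]] = 0;
    stack[sp++] = alive[0];
    while (sp > 0) {
        n = stack[--sp];
        if (pool[n].left >= 0) {
            depth[pool[n].left]  = depth[n] + 1;
            depth[pool[n].right] = depth[n] + 1;
            stack[sp++] = pool[n].left;
            stack[sp++] = pool[n].right;
        }
    }

    for (i = 0; i < NSYM; i++)
        avg += freq[i] * (double) depth[i];

    printf("fixed-length code: %.2f bits/item\n",
           ceil(log((double) NSYM) / log(2.0)));
    printf("Huffman code:      %.2f bits/item\n", avg);
    return 0;
}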

Rob Warnock

UUCP:	{ihnp4,ucbvax!amd70,hpda,harpo,sri-unix,allegra}!fortune!rpw3
DDD:	(415)595-8444
USPS:	Fortune Systems Corp, 101 Twin Dolphin Drive, Redwood City, CA 94065