Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Path: utzoo!utgpu!water!watmath!clyde!cbosgd!ucbvax!OZ.AI.MIT.EDU!MINSKY
From: MINSKY@OZ.AI.MIT.EDU.UUCP
Newsgroups: comp.ai.digest
Subject: AIList Digest   V5 #170
Message-ID: 
Date: Mon, 6-Jul-87 16:29:00 EDT
Article-I.D.: MIT-OZ.MINSKY.12316273277.BABYL
Posted: Mon Jul  6 16:29:00 1987
Date-Received: Sat, 11-Jul-87 13:45:13 EDT
References: 
Sender: daemon@ucbvax.BERKELEY.EDU
Distribution: world
Organization: The ARPA Internet
Lines: 15
Approved: ailist@stripe.sri.com


I would like to see that discussion of "symbol grounding" reduced to
much smaller proportions because I think it is not very relevant to
AI, CS, or psychology.  To understand my reason, you'd have to read
"Society of Mind", which argues that this approach is obsolete because
it recapitulates the "single agent" concept of mind that dominates
traditional philosophy.  For example, the idea of "categorizing"
perceptions is, I think, mainly an artifact of language; different
parts of the brain deal with inputs in different ways, in parallel.
In SOM I suggest many alternative ways to think about thinking and, in
several sections, I also suggest reasons why the single agent idea has
such a powerful grip on us.  I realize that it might seem self-serving
for me to advocate discussing Society of Mind instead.  I would have
presented my arguments here in reply to Harnad, but they would have been
too long-winded, and the book is readily available.