Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site ames.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!godot!harvard!seismo!hao!ames!barry
From: barry@ames.UUCP (Kenn Barry)
Newsgroups: net.philosophy
Subject: Re: sentience and 'meat'
Message-ID: <694@ames.UUCP>
Date: Wed, 12-Dec-84 13:34:30 EST
Article-I.D.: ames.694
Posted: Wed Dec 12 13:34:30 1984
Date-Received: Fri, 14-Dec-84 05:41:27 EST
References: <560@wucs.UUCP>
Distribution: net
Organization: NASA-Ames Research Center, Mtn. View, CA
Lines: 43

	From Paul V. Torek, wucec1!pvt1047:

> From: barry@ames.UUCP (Kenn Barry)
>> I'm sure if I were uploaded to a silicon brain I would change, and change
>> in ways that would not have occurred if I hadn't been moved to different
>> hardware. But, hey, I change every day, anyway. Whatever changes occurred
>> would, I think, be gradual enough that I would still have the continuous
>> sensation of "selfness".
> 
> My point was that silicon is so different that you would have no reason
> to expect *any* sensations at all; that is, our sentience would seem to
> be tied up with the particular type of "meat" found in animals' brains.
> Can anybody argue against such a connection?

	Well, I'll try. Leaving aside consciousness for the moment, let
us at least hypothesize a technology that can emulate in silicon all
the mechanistic aspects of brain function. Now, suppose I suffer damage
to some subsystem in my brain, and have it replaced by a silicon substitute.
Are there any subsystems in the brain for which you feel this replacement
would eliminate consciousness? Any combination of subsystems? Would I
still be me with an electronic visual cortex? With an electronic
hypothalamus? With both?
	My point is that believing consciousness is tied to our particular
hardware seems to force one to accept one of two difficult propositions:
either that consciousness becomes impossible at some seemingly arbitrary
dividing line between "all meat" and "all silicon", or that there is
some necessary precondition for consciousness which is separate from
the physical mechanism that houses it, a "soul", if you will. If a
non-mechanistic soul is necessary for consciousness, it is reasonable
(though not required) to maintain that it is not portable to other hardware.
At the present time, however, I see no reason to believe that the physical
operation of the brain is not a sufficient explanation of consciousness.
I am therefore inclined to think that my "software", consciousness included,
is theoretically portable.
	Comments, as always, are welcome.

-  From the Crow's Nest  -                      Kenn Barry
                                                NASA-Ames Research Center
                                                Moffett Field, CA
-------------------------------------------------------------------------------
 	USENET:		 {ihnp4,vortex,dual,hao,menlo70,hplabs}!ames!barry
	SOURCE:	         ST7891