Path: utzoo!utgpu!watmath!clyde!att!rutgers!rochester!pt.cs.cmu.edu!andrew.cmu.edu!ap1i+
From: ap1i+@andrew.cmu.edu (Andrew C. Plotkin)
Newsgroups: comp.ai
Subject: Re: Artificial Intelligence and Intelligence
Message-ID:
Date: 4 Dec 88 06:55:18 GMT
References: <484@soleil.UUCP> <1654@hp-sdd.HP.COM> <1908@crete.cs.glasgow.ac.uk> <1791@cadre.dsl.PITTSBURGH.EDU> <1918@crete.cs.glasgow.ac.uk> <44150@yale-celray.yale.UUCP>, <281@esosun.UUCP>, <283@esosun.UUCP>
Organization: Carnegie Mellon
Lines: 70
In-Reply-To: <283@esosun.UUCP>

/>Aren't feeling and emotion automatic byproducts of thought? I find it hard to
/>imagine an entity capable of thought which would *not* have (for example) some
/>negative reaction to a threat, involving some attempt to rapidly find
/>alternatives, etc... which is to say, fear and excitement. Other emotional
/>responses can be argued similarly.
/>
/
/ I agree that any thinking entity would have a negative reaction to a threat,
/ involving some attempt to rapidly find alternatives. I just don't see this
/ as being "fear" and "excitement". Let me explain why with an analogy:
/
/ Why does a person take aspirin? I don't believe that the following
/ goes on in his head -- "I say, it appears that those neurons over
/ there are firing excessively. Perhaps I should interrupt their overly
/ enthusiastic behavior..". I claim it is more like: "Owww... that really
/ *hurts*. Gimme some aspirin... NOW!" Although the physical effect of
/ the aspirin might be to cut off some signal in the nervous system, this
/ has very little to do with a person's immediate motivation for taking it.
/ I claim that the signal and the pain are two entirely different sorts of
/ beasts.

Even today we have computer programs that have no "idea" (no access to)
what goes on in their lower levels (the machine language). A Lisp program
manipulates lists without ever referring to the RAM addresses they're
stored at. This seems to me to be an exact equivalent (much simpler, of
course) to the way we don't worry about what neurons are firing.
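For instance (a throwaway sketch, names made up on the spot -- I'm not
quoting any real program):

    (setq groceries '(milk eggs bread))   ; a list of three symbols
    (car groceries)                       ; => MILK
    (cons 'butter groceries)              ; => (BUTTER MILK EGGS BREAD)

Nowhere does this program say which cons cells live at which addresses;
that level belongs to the interpreter, and the program has no vocabulary
for even talking about it -- just as introspection has no vocabulary for
individual neurons.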
If an AI is written in a high-level language, I would expect that it
would have no idea of what routines are running it. Similarly, if an AI
is developed by making a big neural net and kicking it, it would not
know what sort of patterns are running around its circuits. It would
just react by saying things like "OW!"

/ I think we will be able to generate
/ systems to perform any tasks we can think of... even simulate a human!
/ It seems unlikely that the same approach will generate artificial
/ *beings* with subjective experience, but this is just fine with me. ;-)

You mentioned torture -- if you had a computer console in front of you
with a button marked "pain" (any human simulator had better have some
sort of sensory input), would you consider it okay to push it? How about
if a screen (or speaker) were printing the output produced by the
program as it did a nice simulation of a human begging you to stop? If
you first spent an hour or so discussing your favorite music or movies?
(Excuse me; I mean "using the simulation to see how the human being
simulated would respond to your opinions on music or movies.")

Yes, I know, that paragraph is intended to play on your sympathy. But
consider it seriously. You go into a room and spend a while talking to,
and getting answers from, a computer console. Being produced by a good
human simulation, the conversation is typical of any you'd have with a
random stranger. Would you then feel morally justified in pushing that
button, saying "it's only a simulation"?

/>"really feeling emotion." I'll be glad to answer this point, if you can first
/>prove to me that *you* "really feel" pain, sorrow, or pleasure, and don't just
/>react mechanically...)
/
/ I've heard people (usually behaviorists) make this point, but I'm never sure
/ if they're serious (I didn't see a smiley :-). An attempt to answer the
/ riddle of subjective experience by denying its existence seems somewhat
/ pointless.

I'm not -denying- subjective experience. I'm saying that, -whatever
subjective experience is-, if system X and system Y both act the same,
it's silly to say X has subjective experience and Y doesn't. Especially
when the only difference is that X grew by itself and Y was built by X.

This subject has been covered by more impressive people than me; dig up
_The Mind's I_ by Hofstadter and Dennett, which has many essays (and
fiction) by lots of people, with commentary by H & D, all on AI and
minds and brains and stuff. Fun to read, too.

--Z