Path: utzoo!utgpu!attcan!uunet!husc6!mailrus!cornell!rochester!pt.cs.cmu.edu!*!postman+
From: ap1i+@andrew.cmu.edu (Andrew C. Plotkin)
Newsgroups: comp.ai
Subject: Re: Artificial Intelligence and Intelligence
Message-ID:
Date: 29 Nov 88 23:04:27 GMT
References: <484@soleil.UUCP> <1654@hp-sdd.HP.COM> <1908@crete.cs.glasgow.ac.uk> <1791@cadre.dsl.PITTSBURGH.EDU> <1918@crete.cs.glasgow.ac.uk> <44150@yale-celray.yale.UUCP>, <281@esosun.UUCP>
Organization: Carnegie Mellon
Lines: 46
In-Reply-To: <281@esosun.UUCP>

/>Of course not. Intelligent machines won't act much like humans at
/>all. They will have different needs, different feelings, different
/>goals, plans, desires for life than we. But they'll be no less
/>intelligent, thinking, feeling beings than we, for it.
/
/ I can accept the needs, goals, and plans... but why does everyone
/ assume that an intelligent machine would be a *feeling* being? I see
/ no reason to assume that an IM would be capable of experiencing
/ anything at all. This doesn't imply that it wouldn't be intelligent.
/ For instance: some machines are already capable of distinguishing blue
/ light from red. This doesn't mean that they have anything like our
/ *experience* of blue. (Or pain, or sorrow, or pleasure... etc.)

Aren't feeling and emotion automatic byproducts of thought? I find it
hard to imagine an entity capable of thought which would *not* have
(for example) some negative reaction to a threat, involving some
attempt to rapidly find alternatives, etc... which is to say, fear and
excitement. Other emotional responses can be argued similarly.

It's true that our physical bodies provide extra sensation by pumping
out adrenaline and so forth, but the original emotions are often
generated by the mind first and -then- relayed to the glands; for
example, the panic you feel when a pay cut notice appears on your desk.

(One can still argue that "emotional reactions" don't prove that the
machine is "really feeling emotion."
I'll be glad to answer this point, if you can first prove to me that
*you* "really feel" pain, sorrow, or pleasure, and don't just react
mechanically...)

/ I would rather not have a
/ machine that I would be afraid to turn off for fear of harming
/ *someone*.

If you don't want a "someone", you'd better stay out of AI research...
:-)

/ It does seem that our experience is rooted in some kind of
/ electro-chemical phenomenon, but I think it is an incredible leap of
/ faith to assume that logic circuits are all that is required :-).

It's a point of faith, certainly, since we don't have more than one
example. But I don't think it's unjustified, since nothing more than
logic circuits has ever been observed. ("Logic circuits" is a bit of a
misnomer, since neurons don't act like single standard logic gates.
However, it looks possible to simulate them with electronics.)

--Z
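As a concrete sketch of "simulate them": one standard simplification of a neuron is the leaky integrate-and-fire model, in which a membrane potential decays toward a resting level, accumulates input, and emits a spike when it crosses a threshold. Below is a minimal illustration in Python; the parameter values (rest level, threshold, leak rate) are arbitrary demonstration choices, not measured neural constants.

```python
def simulate_lif(inputs, v_rest=0.0, v_thresh=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: return the time steps at which it spikes.

    Parameters are illustrative, not physiological.
    """
    v = v_rest
    spikes = []
    for t, i_in in enumerate(inputs):
        # Decay toward the resting potential, then add this step's input.
        v = v_rest + leak * (v - v_rest) + i_in
        if v >= v_thresh:
            spikes.append(t)   # the neuron fires...
            v = v_rest         # ...and resets
    return spikes

# A steady sub-threshold drip of input still accumulates into periodic spikes.
print(simulate_lif([0.3] * 10))  # -> [3, 7]
```

Even this toy already behaves unlike a single logic gate: the output depends on the history of the input, not just its current value.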