Path: utzoo!utgpu!watmath!clyde!att!osu-cis!tut.cis.ohio-state.edu!bloom-beacon!apple!voder!pyramid!prls!philabs!linus!mbunix!bwk
From: bwk@mitre-bedford.ARPA (Barry W. Kort)
Newsgroups: comp.ai
Subject: Re: Artificial Intelligence and Intelligence
Summary: Simulated pain vs. real pain.
Keywords: Reports of Distress vs. Distress
Message-ID: <42468@linus.UUCP>
Date: 6 Dec 88 04:28:25 GMT
References: <484@soleil.UUCP> <1654@hp-sdd.HP.COM> <1908@crete.cs.glasgow.ac.uk> <1791@cadre.dsl.PITTSBURGH.EDU> <1918@crete.cs.glasgow.ac.uk> <44150@yale-celray.yale.UUCP> <281@esosun.UUCP> <283@esosun.UUCP>

ap1i+@andrew.cmu.edu (Andrew C. Plotkin) writes:

 > ... if you had a computer console in front of you with a button
 > marked "pain" (any human simulator had better have some sort of sensory
 > input), would you consider it okay to push it? 

Yes, if I were testing the computer's pain circuits.  When a computer
is in pain (i.e., a circuit board is burning out, or a cable is being
cut), I want to be sure that it can sense its distress and accurately
report its state of well-being.  Similarly, if I put the machine in
emotional pain (by giving it a program that runs forever and does
no useful work), I hope the machine can diagnose the problem and
gracefully apprise me of my error.  Getting an incomprehensible
core dump is like having a baby throw up because something it ate
was indigestible.  (I find core dumps indigestible.)
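
To make the point concrete, here is a minimal sketch in C (my own
illustration, not anything from an actual system; the five-second
budget and the busy loop are arbitrary stand-ins) of a program that
notices it has exceeded its work budget and reports the condition
gracefully instead of dumping core:

/* Sketch of graceful self-diagnosis: a watchdog timer stands in for
 * the machine's sense of distress, and the report is meant to be
 * legible to the person who caused the problem. */
#include <signal.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

static volatile sig_atomic_t budget_exceeded = 0;

static void alarm_handler(int sig)
{
    (void)sig;
    budget_exceeded = 1;          /* note the distress; do not abort */
}

int main(void)
{
    unsigned long iterations = 0;

    signal(SIGALRM, alarm_handler);
    alarm(5);                     /* assumed 5-second work budget */

    /* Stand-in for a program that "runs forever and does no useful
     * work": it spins until the watchdog notices. */
    while (!budget_exceeded)
        iterations++;

    fprintf(stderr,
            "diagnostic: exceeded my time budget after %lu iterations "
            "without producing a result; please check the program "
            "you gave me.\n", iterations);
    return EXIT_FAILURE;          /* a graceful report, not a core dump */
}

A real machine would need a richer notion of "no useful work" than a
timer, but the point is that the complaint is comprehensible to the
person responsible for it.
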

Barry Kort