Path: utzoo!utgpu!watmath!clyde!att!rutgers!mcnc!xanth!ames!amdahl!uunet!dalcs!iisat!paulg
From: paulg@iisat.UUCP (Paul Gauthier)
Newsgroups: comp.ai
Subject: Re: Artificial Intelligence and Intelligence
Summary: Feelings aren't only for humans...
Message-ID: <177@iisat.UUCP>
Date: 1 Dec 88 23:14:31 GMT
References: <484@soleil.UUCP> <1654@hp-sdd.HP.COM> <1908@crete.cs.glasgow.ac.uk> <281@esosun.UUCP>
Organization: International Information Service, Dart., NS
Lines: 58

In article <281@esosun.UUCP>, jackson@freyja.css.gov (Jerry Jackson) writes:
> In article <44150@yale-celray.yale.UUCP>, engelson@cs (Sean Philip Engelson) writes:
> >all.  They will have different needs, different feelings, different
> 
> I can accept the needs, goals, and plans... but why does everyone
> assume that an intelligent machine would be a *feeling* being?  I see
> no reason to assume that an IM would be capable of experiencing
> anything at all.  This doesn't imply that it wouldn't be intelligent.
	Actually, I think it does. Feelings are simply products of
intelligence. Once any form of intelligence reaches the complexity of
the human mind it will undoubtedly experience 'feelings.' Feelings are
simply manifestations of the mind's goals and needs. You feel 'sad' when
you don't attain a goal; this is simply a negative feedback response to
prod you into trying harder. It might not work in all cases, but it helps.
The word 'feeling' is very broad. Feelings of fear are manifestations of your
mind's attempts to deal with the unknown or with threats. What you experience as
fear is the workings of your mind trying to come to a decision in a tough
situation.
	This whole topic is very hard to discuss and I'm sure I've
bungled it quite nicely, but I hope I have put across something resembling
my true opinion. All these things people refer to as feelings,
things which many consider reserved for humans alone, are results of
inconsistencies in our knowledge-bases and signs of our intelligence at
work. A feeling is an educated guess that our mind makes based on what it
can puzzle out from known facts. As you can see, the word 'feeling' does
a poor job of covering the myriad types of feeling there are, so they are
hard to discuss...
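The negative-feedback idea above can be put as a toy sketch. Everything here is illustrative and invented for the sake of the example; it isn't any real system, just "sadness" modelled as a signal that boosts effort after a missed goal:

```python
# Toy sketch: 'sadness' as negative feedback on an unmet goal.
# All names and numbers are invented for illustration.

class Agent:
    def __init__(self):
        self.effort = 1.0  # how hard the agent currently tries

    def attempt(self, goal_difficulty):
        success = self.effort >= goal_difficulty
        if not success:
            # Negative feedback: the 'sad' signal prods the agent
            # into trying harder on the next attempt.
            self.effort += 0.5
        return success

agent = Agent()
results = [agent.attempt(2.0) for _ in range(4)]
print(results)  # -> [False, False, True, True]
```

The point of the sketch is only that the "feeling" is not an extra ingredient: it is the feedback term itself, doing useful work in the loop.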

> For instance: some machines are already capable of distinguishing blue
> light from red.  This doesn't mean that they have anything like our
> *experience* of blue. (Or pain, or sorrow, or pleasure... etc.)

	All your *experience* of blue is your brain searching its
memory to figure out what 'blue' is. Undoubtedly it flashes through
memories connected to 'blue' which trigger the *experience* of blue. When
machines have large enough inter-connected knowledge-bases they too will
come across past experiences which relate to blue and *experience* the
color.
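That retrieval process can be sketched as a toy associative knowledge-base. The concepts and links below are made up for illustration; the "experience" is just the set of memories reachable from 'blue' within a few hops:

```python
# Toy associative knowledge-base: 'experiencing' a concept as
# retrieval of the memories linked to it. Illustrative only.

memories = {
    "blue": ["sky", "ocean"],
    "sky": ["clouds"],
    "ocean": ["waves"],
}

def experience(concept, depth=2):
    """Collect memories reachable from a concept within 'depth' hops."""
    seen, frontier = set(), [concept]
    for _ in range(depth):
        frontier = [m for c in frontier
                    for m in memories.get(c, []) if m not in seen]
        seen.update(frontier)
    return sorted(seen)

print(experience("blue"))  # -> ['clouds', 'ocean', 'sky', 'waves']
```

On this view the machine's *experience* of blue, like ours, would be whatever cascade of associations the concept sets off.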

> Personally, I think this is a good thing.  I would rather not have a
> machine that I would be afraid to turn off for fear of harming
> *someone*.  It does seem that our experience is rooted in some kind of
> electro-chemical phenomenon, but I think it is an incredible leap of
> faith to assume that logic circuits are all that is required :-).

	Personally, I find the idea exciting. I'm patiently waiting for
the first machine sentience to emerge. I feel it is possible, and it is
only a matter of time. After all, humans are only carbon-based machines.

> 
> --Jerry Jackson


-- 
|=============================================================================|
| Paul Gauthier:    {uunet, utai, watmath}!dalcs!iisat!{paulg | brains!paulg} |
|                   Cerebral Cortex BBS: (902)462-7245  300/1200  24h/7d  N81 |
|=============================================================================|