Path: utzoo!utgpu!watmath!clyde!att!whuts!homxb!homxc!marty
From: marty@homxc.UUCP (M.B.BRILLIANT)
Newsgroups: comp.ai
Subject: Re: Feeling and thought: which comes first?
Message-ID: <4524@homxc.UUCP>
Date: 9 Dec 88 18:15:59 GMT
References: <2609@datapg.MN.ORG>
Organization: AT&T Bell Laboratories, Holmdel
Lines: 63

In article <2609@datapg.MN.ORG> sewilco@datapg.MN.ORG (Scot E Wilcoxon) wrote:
> In article <5626@sdcsvax.UCSD.EDU> pluto@beowulf.UCSD.EDU writes:
>>One net-poster remarked that emotions and feeling are a natural
>>by-product of thought.  I imagine that thought is a natural by-product
>>of feeling and emotion.
> 
> Emotions and feelings are a "natural byproduct of" how our Terran bodies
> and minds function.......
> Some "feelings" are also triggered by instinct or feedback....  Feedback
> can cause feelings either due to memories triggering neuronal activity
> which are a "memory" of past feelings, or due to thoughts causing
> limbic-detected chemicals ("hormones") to be produced.

I think I basically agree with both.  I have a concept that is
probably not testable, but one I find comfortable.  Maybe it's a sort
of extended definition.

Reason (the word ``thought'' is ambiguous) is something both computers
and humans can do.  But reason does not lead to decisions unless goals
are defined.  Goals are not rationally derivable, except from more
fundamental goals.

Survival is a goal.  Hunger, fear, etc. are feelings that tell a human
that survival is at risk.  Joy, relief, etc. are feelings that tell a
human that a goal is being met.  Feelings clue us in to what our goals
are, and then we use reason to further define those goals and decide
what course of action would attain those goals.

Ordinarily a human can take a course of action that will produce good
feelings in the long run, without causing bad feelings in the short
run, and we call that ``rational behavior.''  When a human takes a
course of action that produces good feelings immediately, but bad
feelings later, we say he/she is ``not behaving rationally.''  I think
all our actions are driven fundamentally by our need to feel good,
which is built in to help ensure survival, though it sometimes fails.

Reason is built into computers, in the instruction set.  Goals are not.
Any goals a computer might have must be programmed into it.  But they
would then function in pretty much the same way feelings function in
humans.  When you feel pain you know something is wrong; it is a
feeling that says you should stop what you are doing.  Ordinarily,
computers are programmed to give a message to a human when something is
wrong.  If a computer is to handle such a situation without human
intervention, it must have a hierarchy of goals.
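To make that concrete, here is a minimal sketch in C of what I mean by
a hierarchy of goals.  Everything in it -- the state fields, the goal
names, the checks -- is an illustrative assumption of mine, not a
description of any real system: when something is wrong, the program
consults its goals in priority order instead of stopping to print a
message for a human.

    /* Hypothetical sketch of a hierarchy of goals.  All names and
     * thresholds here are made up for illustration.
     */
    #include <stdio.h>

    struct state { double battery; double temperature; int task_done; };

    static int power_low(struct state *s)  { return s->battery < 0.05; }
    static int too_hot(struct state *s)    { return s->temperature > 90.0; }
    static int job_undone(struct state *s) { return !s->task_done; }

    struct goal { const char *name; int (*at_risk)(struct state *); };

    /* Highest-priority goal first: survival outranks the task. */
    static struct goal goals[] = {
        { "stay powered",   power_low },
        { "avoid damage",   too_hot },
        { "finish the job", job_undone },
    };

    static const char *choose_action(struct state *s)
    {
        int i;
        for (i = 0; i < (int)(sizeof goals / sizeof goals[0]); i++)
            if (goals[i].at_risk(s))
                return goals[i].name;   /* most urgent unmet goal */
        return "idle";                  /* nothing is wrong */
    }

    int main(void)
    {
        struct state s = { 0.5, 95.0, 0 };
        printf("%s\n", choose_action(&s)); /* prints "avoid damage" */
        return 0;
    }

The point of the ordering is that "avoid damage" wins even though the
job is unfinished, just as pain overrides appetite.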

That is not to say that a computer must have feelings in the way we
know we have them.  I would say our feelings are a set of interrupts
that we use to tell us how well our actions meet our goals.  If a
computer is to handle multiple interrupts without human intervention
(in a way that helps it survive and do what it is supposed to do), it
needs something that does for it what feelings do for us.
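Again a minimal sketch, with everything in it assumed for
illustration: the "feelings" below are just severity-tagged signals
that the program weighs between steps of its normal work, with pain
cutting the work short.

    /* Hypothetical sketch of "feelings as interrupts": severity-tagged
     * signals checked between steps of normal work.  The events and
     * severities are illustrative assumptions.
     */
    #include <stdio.h>

    #define MAX_FEELINGS 16

    struct feeling { int severity; const char *message; }; /* 0 = pain */

    static struct feeling queue[MAX_FEELINGS];
    static int pending = 0;

    static void feel(int severity, const char *message)
    {
        if (pending < MAX_FEELINGS) {
            queue[pending].severity = severity;
            queue[pending].message  = message;
            pending++;
        }
    }

    int main(void)
    {
        const char *steps[] = { "step 1", "step 2" };
        int i, j;

        feel(2, "progress made");   /* a mild 'joy' signal */
        feel(0, "overheating");     /* a 'pain' signal     */

        for (i = 0; i < 2; i++) {
            printf("working on %s\n", steps[i]);
            /* Weigh every pending feeling before the next step. */
            for (j = 0; j < pending; j++) {
                if (queue[j].severity == 0) {
                    printf("pain: %s -- stop what you are doing\n",
                           queue[j].message);
                    return 0;       /* survival outranks the task */
                }
                printf("noted: %s\n", queue[j].message);
            }
            pending = 0;
        }
        return 0;
    }

Mild signals get noted and work continues; a pain signal stops the
task entirely, which is what I mean by feelings doing for us what
interrupts do for a machine.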

What I just wrote doesn't look rigorously logical to me, and I don't
intend to prove it.  I take it as a working hypothesis.  It helps me to
conceptualize a world in which humans are intelligent, rats learn, and
``artificial intelligence'' is discussed.  It might help someone else.

M. B. Brilliant					Marty
AT&T-BL HO 3D-520	(201) 949-1858		Home (201) 946-8147
Holmdel, NJ 07733	att!houdi!marty1

Disclaimer: Opinions stated herein are mine unless and until my employer
	    explicitly claims them; then I lose all rights to them.