Path: utzoo!censor!jeff
From: jeff@censor.UUCP (Jeff Hunter)
Newsgroups: comp.ai
Subject: Re: Artificial Intelligence and Intelligence
Summary: intelligence without emotions
Message-ID: <163@censor.UUCP>
Date: 6 Dec 88 05:12:03 GMT
References: <484@soleil.UUCP> <1654@hp-sdd.HP.COM> <1908@crete.cs.glasgow.ac.uk> <177@iisat.UUCP>
Organization: Bell Canada, Business Development, Toronto
Lines: 57

In article <177@iisat.UUCP>, paulg@iisat.UUCP (Paul Gauthier) writes:
> In article <281@esosun.UUCP>, jackson@freyja.css.gov (Jerry Jackson) writes:
> > I can accept the needs, goals, and plans... but why does everyone
> > assume that an intelligent machine would be a *feeling* being?  I see
> > no reason to assume that an IM would be capable of experiencing
> > anything at all.  This doesn't imply that it wouldn't be intelligent.
> 	Actually, I think it does. Feelings are simply products of
> intelligence. Once any form of intelligence reaches the complexity of
> the human mind it will undoubtably experience 'feelings.' Feelings are
> simply manifestations of the minds goals and needs. You feel 'sad' when

    I disagree. Emotions form a relatively simple reasoning system. (Lest 
you get the wrong impression from the start let me hasten to add that I
enjoy my emotions [most of the time at least]. I'm just saying that they're
not all that bright.)
    For example: I like you. I share with you. You like me. You share back.
There's a viable society without needing deep thought on the economics of
co-operation vs. competition, or long computer modelling runs, etc.
"Like", "trust", "distrust", and "hate" form such useful models of
behaviour that just about any mammal or bird uses them to reason about
relationships.
    I assume that any relatively dumb intelligence that must operate
in a social environment would need similar shortcuts to reason with.
Smarter intelligences "evolved" from the dumb ones would probably retain
the emotions just from design economy.
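    To make the reciprocity point concrete, here's a minimal sketch (my own
illustration, not anything from the above) of agents that keep a single
"liking" score per partner instead of doing any explicit game-theoretic
reasoning. The names, scores, and update rule are all hypothetical; the
point is only that a one-number affect model is enough to sustain mutual
sharing.

```python
# Hypothetical sketch: "like"/"distrust" scores standing in for deep
# reasoning about co-operation vs. competition. Each agent tracks one
# affect number per partner and shares only while that number is
# non-negative. Sharing raises affect; defecting lowers it sharply.

class EmotionalAgent:
    def __init__(self, name):
        self.name = name
        self.liking = {}  # partner name -> affect score

    def will_share(self, partner):
        # Strangers start with mild trust (score 1).
        return self.liking.get(partner, 1) >= 0

    def observe(self, partner, they_shared):
        # Reciprocity rule: reward sharing, punish defection harder.
        delta = 1 if they_shared else -3
        self.liking[partner] = self.liking.get(partner, 1) + delta

def interact(a, b):
    # One round: each decides from its current affect, then both update.
    a_shares, b_shares = a.will_share(b.name), b.will_share(a.name)
    a.observe(b.name, b_shares)
    b.observe(a.name, a_shares)
    return a_shares, b_shares

ann, bob = EmotionalAgent("ann"), EmotionalAgent("bob")
for _ in range(3):
    interact(ann, bob)  # mutual sharing builds mutual liking

print(ann.liking["bob"])  # -> 4: affect grew with no modelling runs at all
```

A single defection would drop the score by 3 and cut the defector off,
which is roughly the "distrust" shortcut doing the work of a long
cost-benefit analysis.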

    Emotional reasoning can often outperform logical reasoning (watch any
episode of Star Trek :-). Lots of people have stopped smoking because of
guilt rather than reasoned argument. However, emotions (especially strong
ones) can make people do really dumb things too. Blind love and blinding
hatred are clichés.

    If I were dealing every day with an artificial intelligence then I'd
prefer it to have human-like emotions (or at least dog-like). I'd form
an emotional attachment and I'd be sort-of disappointed if it declared
me surplus protein and fed me to the grinder :-).

    However, an intelligence that didn't have to interact with others
wouldn't need to be run by emotions. A robot asteroid miner, for example,
could be set the task of converting a planetoid into can openers and
paper weights. It wouldn't have to have a favourite ore truck, or be
pleased with the day's output, or panic and freeze if a major mechanical
failure struck. It wouldn't even have to feel disappointed or noble
as it melted itself down to make the last crate of paper weights.
    Conversely I could see an emotional version of the same machine
that could probably do just as good a job. (The emotions would have 
to be adjusted from human norms though.)

    In summary, I think that intelligence doesn't require emotions, but
near-term "real" artificial intelligences will need them to interact
with humans, and the emotions will probably hang around unless they
are designed out for a purpose.
-- 
      ___   __   __   {utzoo,lsuc}!censor!jeff  (416-595-2705)
      /    / /) /  )     -- my opinions --
    -/ _ -/-   /-     The first cup of coffee recapitulates phylogeny...
 (__/ (/_/   _/_                                    Barry Workman