Xref: utzoo comp.ai:2727 talk.philosophy.misc:1640
Path: utzoo!utgpu!watmath!clyde!att!rutgers!mailrus!ames!amdahl!pyramid!thirdi!metapsy!sarge
From: sarge@metapsy.UUCP (Sarge Gerbode)
Newsgroups: comp.ai,talk.philosophy.misc
Subject: Re: Artificial Intelligence and Intelligence
Message-ID: <562@metapsy.UUCP>
Date: 29 Nov 88 05:43:51 GMT
References: <484@soleil.UUCP> <1654@hp-sdd.HP.COM> <1908@crete.cs.glasgow.ac.uk> <1791@cadre.dsl.PITTSBURGH.EDU> <819@novavax.UUCP> <1811@cadre.dsl.PITTSBURGH.EDU> <757@quintus.UUCP>
Reply-To: sarge@metapsy.UUCP (Sarge Gerbode)
Organization: Metapsychology, Woodside, CA
Lines: 55

In article <757@quintus.UUCP> ok@quintus.UUCP (Richard A. O'Keefe) writes:
>In Chapter 2 of  "Ten Philosophical Mistakes", Mortimer J. Adler says
>
>	...
>	For [humans] as well as for [animals], mind or intelligence
>	stands for faculties or powers employed in learning from
>	experience and in modifying behaviour in consequence of such
>	learning.
>
>This definition of intelligence would appear to be one that could
>meaningfully be applied to machines.

The significance of this definition would depend on what is to be
included as "learning".  A mere modification of behavior based on a
change of environment would not, to my mind, qualify as "learning".
For instance, the switching action of a thermostat in response to
environmental changes in temperature would not entitle it to be
considered to have "learned" anything, nor to be considered
intelligent.
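To make the thermostat point concrete, here is a minimal sketch (my own illustration, in Python, not anything from the quoted text): the device's entire behavior is a fixed rule mapping temperature to a switch state, and nothing in it changes as a result of experience.

```python
# A thermostat reduced to its essentials: a fixed rule with a little
# hysteresis.  Its response varies with the environment, but the rule
# itself never changes -- so nothing is "learned".

def thermostat(temperature, setpoint=20.0, hysteresis=1.0, heater_on=False):
    """Return the new heater state for a given temperature reading."""
    if temperature < setpoint - hysteresis:
        return True   # too cold: switch the heater on
    if temperature > setpoint + hysteresis:
        return False  # too warm: switch the heater off
    return heater_on  # within the band: leave the heater as it is

state = False
for reading in [18.0, 19.5, 22.0, 20.5]:
    state = thermostat(reading, heater_on=state)
```

However long it runs, the rule at the end is identical to the rule at the start; its "modification of behavior" is exhausted by the change of input.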

And a person can exercise intelligence without behaving (physically),
e.g. by thinking up a brilliant thought.  Some very intelligent people
("effete intellectual snobs", I believe they used to be called :-) )
are very good at not applying their intelligence to real life.

So the "behavior" part is extraneous to intelligence.  It is the
"learning" that is crucial.

We could say that anything that could learn could be intelligent.  Or,
intelligence is the ability to learn.  Intelligence tests were
originally designed to predict school performance, i.e. learning
ability, so that would fit this definition.

The next question is whether machines could be said to "learn" in
anything but a metaphorical sense.  Perhaps they can be made to behave
in ways that imitate the behavior thought to result from actual
learning, but would that mean they had actually "learned" anything?
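The question can be sharpened with a small sketch (again my own, hypothetical example): a program whose internal state, and hence later behavior, is shaped by its earlier inputs.  It satisfies Adler's definition verbatim -- it modifies behavior in consequence of experience -- yet whether it "learns" in more than a metaphorical sense is exactly what is at issue.

```python
# A minimal "learner": it keeps a running estimate of a quantity and
# adjusts that estimate after each observation.  Its later predictions
# differ because of its earlier inputs.

class RunningEstimator:
    def __init__(self):
        self.estimate = 0.0
        self.count = 0

    def predict(self):
        return self.estimate

    def observe(self, value):
        # incremental mean: estimate += (value - estimate) / n
        self.count += 1
        self.estimate += (value - self.estimate) / self.count

learner = RunningEstimator()
for observation in [10.0, 12.0, 11.0]:
    learner.observe(observation)
# learner.predict() now returns 11.0, shaped entirely by past inputs
```

Nothing here settles the question; it only shows that "modifies behavior from experience" is cheap to satisfy, which is why the subjective side of learning carries the weight in what follows.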

Each of us humans has direct subjective apperception of what it is to
learn -- it is to acquire knowledge, to come to know something, to
acquire a fact.  What we do behaviorally with what we learn is another
matter.

Do machines have the same subjective experience that we do when we
say we have learned something, or any subjective experience at all?
It seems quite questionable.  Since their behavior is completely
explainable in terms of the hardware design, the software program,
and the input data, Occam's Razor demands that we not attribute
subjectivity to them.
-- 
--------------------
Sarge Gerbode -- UUCP:  pyramid!thirdi!metapsy!sarge
Institute for Research in Metapsychology
950 Guinda St.  Palo Alto, CA 94301