Re: Grandmother objections

From: Harrison, Richard (RJH93PY@psy.soton.ac.uk)
Date: Thu Dec 14 1995 - 12:01:09 GMT


Hi Stevan,

> From: harnad@cogsci.soton.ac.uk (Harnad, Stevan)
> > From: "Lucas, Melody" <MFL93PY@psy.soton.ac.uk>
 
[Grandmother objections]
> (11) Computers are isolated from the world; we are not.
>
> [Computers can be as interactive with the world as their input/output
> devices make them.]
>
> A nongrandmotherly version of this objection, however, points to the
> symbol grounding problem: The symbols in a computer are ungrounded; our
> brains are not.

And this is Searle's point, isn't it? If machines are only programs,
that is, computational symbol manipulators, then they do not have
understanding in the way we do (even if they can pass either version
of the Turing Test).

This seems reasonable to me (am I missing something?). You said the
main objection to the Chinese Room Argument is that the System as a
whole 'understands' Chinese even if the English speaker in the room
doesn't. This objection appears to be easily countered by having the
man internalise the rules and answer the Chinese questions outside
the room.

So we move on to the symbol grounding problem, as it is the only
fundamental difference between biological machines/minds (whatever)
and nonbiological 'built' machines. There seems to be a problem in
grounding (giving meaning to) symbols in the latter: for us (and other
animals) there are consequences to getting it right or wrong, whereas
programming a computer to care ('be careful or we'll unplug you...')
doesn't seem to be the same thing. (I think this was the conclusion
Denise came to in the seminar (?)).

> > > Grandmother-objections are the ones that everyone always
> > > raises at first; rare objections whatever their direction.

Am I getting anywhere with this or am I a Grandmother in disguise?

Also, where has all this left our definition of machine? Of mind?

Richard


