Re: Does Evolution Have a Mind?

From: HARNAD, Stevan (harnad@coglit.ecs.soton.ac.uk)
Date: Sun Mar 05 2000 - 17:12:40 GMT


On Sun, 5 Mar 2000, Cliffe, Owen wrote:

> On Sun, 5 Mar 2000, HARNAD, Stevan wrote:
> > One can simulate the rest too, of course, but that produces a virtual
> > creature in a virtual world the same way the plane simulation produces
> > a virtual plane virtually flying in virtual air (i.e., squiggles and
> > squoggles systematically interpretable by US as flying, but in reality
> > just squiggling and squoggling). And just as virtual cakes are
> > inedible, and virtual fire non-flammable, virtual minds are non-mental!
>
> ok, but you would still have something that could pass T2, wouldn't you?
> as the virtual-real interface for symbolic conversation is surpassable,
> isn't it?
>
> sorry for going round in circles :)

You're absolutely right. T2 can in principle be passed by an
implementation-independent symbol system, and that's the loophole
that Searle exploits.

But, strictly speaking, T2 only applies to the human pen-pal TT
(symbols-in, symbols-out), not to virtual creatures in virtual worlds.
The robotic TT (T3) can only be conducted in the real world. So a
virtual robot in a virtual world is not passing any kind of TT
(although, as an oracle, it might be successfully "test-piloting"
everything it would take to design a robot that really would pass T3).

So a virtual robot might, among other things, be able to pass the real
T2 (I don't think so, but it is logically possible). It is more likely
to be the other way round, however: it may well require no less
than a real T3 robot even just to pass T2 (even though T2 doesn't test
any of its T3 powers directly). To anticipate the solution to the
symbol grounding problem (coming soon): T2 power may need to be
grounded in T3 power.

Stevan



This archive was generated by hypermail 2b30 : Tue Feb 13 2001 - 16:36:27 GMT