> From: James Grady, jrg197@ecs.soton.ac.uk (i.e. not sent from my uni address)
>
> Could there be a purely T3 robot? (i.e. a robot which passes T3 but not T5)
You are asking a functional question: Could a system that is less than
a system able to pass T5 (or T4) nevertheless pass T3?
> A T3 robot would have to be externally indistinguishable from a human
> by a human. Now to do this it would have to be symbolically grounded in
> the same (or a similar) way as a human. However, this is impossible.
It only means T3 has to be able to DO everything a human can do (that's
the TT). So the symbols need to be grounded, but there are no rules
about HOW they should be grounded (nor about their actual history).
> A T3
> robot is not going to have the same weaknesses and needs as a human.
One human doesn't have the same weaknesses and needs as another,
either. All T3 needs is generic capacities, indistinguishable from one
of us.
> For example, an apple is going to be a red/green fruit to both machines
> (including the human); however, it is only going to mean 'nourishing' to
> the human. Therefore a T3 robot IS going to be distinguishable.
It is only distinguishable if its performance actually distinguishes it
(that's a performance-capacity question).
I'm not sure what you mean by "mean". If you mean "grounded", then all
that asks for is T3 power in the world. Many people have never eaten an
apple. (Some perhaps have never seen red!) We don't know whether the
T3 robot (or anyone) really feels anything; we ask only for T3 power
with apples.
Or are you saying that to ground "eating," you have to be a system that
really eats? (If so, why? You could be right, but why?)
> In the same way that you would need a T3 robot to pass T2, you would
> need a T5 robot to pass T3. A robot must not only interact with the
> environment but also do so in the same way as a human. Otherwise it
> would not pass T3.
First, why T5? Isn't T4 (synthetic system, eating real or synthetic
apples) the same for your argument?
But, if you are right (and you could be) that T4 power is somehow
needed to get T3 power, you need to tell us how/why. The grounding story
explains why you might need T3 power to get T2 power. What is the T4/T3
counterpart to that problem?
> This to me seems to be a fundamental weakness of Turing's test. Would
> it be enough to have a test in which a robot has to convince a panel of
> judges that it is able to think (i.e. not just that it is human)?
> Surely if it is able to convince a panel of judges, say over a period
> of a week (or even years), that it was thinking, then it WOULD be
> thinking.
I couldn't follow that. Why is one week special? And convince them it
could think (= ?) how? Via T2 (language alone) or via T3? We are back
where we started.
--------------------------------------------------------------------
HARNAD, Stevan                    harnad@cogsci.soton.ac.uk
Professor of Cognitive Science    harnad@princeton.edu
Department of Electronics and     phone: +44 23-80 592-582
Computer Science                  fax:   +44 23-80 592-865
University of Southampton         http://www.cogsci.soton.ac.uk/~harnad/
Highfield, Southampton            http://www.princeton.edu/~harnad/
SO17 1BJ UNITED KINGDOM