Re: Chalmers: Computational Foundation

From: HARNAD Stevan (harnad@coglit.ecs.soton.ac.uk)
Date: Wed Mar 21 2001 - 09:42:06 GMT


On Wed, 21 Mar 2001, jh798 wrote:

> >HARNAD:
> >A real-time history is probably the usual and the optimal way of
> >getting those things into a head, but if the current state could be
> >built in directly, without a real-time history, would that make the
> >system any different from what it would be if it had earned them the
> >honest way?
>
> >On the other hand, for T3 (or even T2) passing, there has to be a
> >forward-going history, starting from whenever the Testing starts. The
> >capacity for interacting in real time is part of T3.
>
> Hudson:
> Sure, I agree that in 'principle' if you could somehow download in a flash
> all the myriad experiences and remembered sensations and subtle personality
> traits of a mind (and lose nothing in the process) into some mechanical
> contraption capable of the same functionality as the real-time variant, then
> of course both would (or could if they chose) be indistinguishable in their
> behaviour.
>
> But then how on earth could we possibly make a machine with such a
> phenomenal data assimilation capacity? (And would we be wise to do so?
> Does rapid extinction sound good to anybody?)

Wisdom and practicality are not the issue here. The question was: Is
real-time (past) history somehow necessary for generating cognition?
The answer is: No. (But forward-going real-time history IS necessary
for testing and passing the TT from that moment onward.)

> >HARNAD:
> >It is not implementation-independence that is at issue with real-time
> >history. But symbol grounding might be part of what's at issue.
>
> Hudson:
> Originally I was thinking of computation pretending to be a mind when I
> wrote this. But then, when something runs in real time, doesn't this
> place a certain performance requirement on the hardware as well as a
> functional one?

Yes. But so what? What does that performance requirement have to do
with the question of whether or not cognition equals computation, or
of whether or not cognition must have a real-time past?

> Hudson:
> Does a worm feel pain? I don't know. Let's suppose for a moment that it
> can be aware of pain or other sensations.

If a worm feels, then it is just as relevant to what we have been
discussing as you or I are. If it does not feel, then it is just as
irrelevant as a stone.

Because of the other-minds problem, there's no way for anyone to know
the truth except the worm (and it's too stupid to worry about it). My
guess is that a worm does feel.

> Who is being aware? The worm. Who is the worm? No one; it's just a worm.

So what?

> Hudson:
> Then what does a pinch mean to a worm? If we get pinched, the feeling is
> always: "'I' am in pain."

That's what it is to a sophisticated person like you, or Descartes.
But to a simpler person, or an infant, or someone in a delirious
state, the pain is still there, it still feels like something, it's
still being felt -- it's just that all those other fancy "cogito ergo
sum" ("I think therefore I am") thoughts are not going on.

But we are talking about whether a system has a mind at all; not just
about whether it happens to be sophisticated enough to think those
fancy thoughts. I didn't ask about what a pinch "means" to a worm; just
about whether or not it FEELS it.

> Hudson:
> You could say the relevance of the sensation is
> 'grounded' in the sense of self. If there is no self, how is sensation or
> feeling relevant, and who is it relevant to?

All you need to say is that the feeling is felt. It's felt by the
feeler. All the rest is just theorizing about what's going on. We
humans not only feel, hence think (a process is not a thought process
unless it is felt by someone; otherwise it's just a Zombie process),
but we also think ABOUT feeling and thinking (and self and AI). Those
are the extras, not the essentials.

All of that is just the specific CONTENT of our feelings and felt
thoughts. We might just as well have thought only about eating and
sleeping. It makes no difference. Implementing mental processes means
implementing ANY of those, not particularly just the sophisticated
ones. (And failing to implement them means failing to implement ANY of
them.)

> Hudson:
> What and who is doing the feeling if there
> is no self-awareness? Who is the 'I' in "I am in pain" for the worm?

The feeler of the feeling is feeling it, whether or not it also happens
to be capable of reflecting and theorizing on what is going on. (I
warned you that the problem is awareness, not "self-awareness."
Self-awareness is just a luxury that most normal, thinking adults
also happen to have...)

> Hudson:
> I think without the anchor point of a sense of self (i.e., self-awareness),
> awareness has no meaning. I think you need to start by being aware of
> yourself before you can be aware of anything else.

Reflect for a while on whether "feeling" makes any sense without a
"feeler." (Does "unfelt feeling" make any sense?) If you realize that
part of the nature of feeling is that it's felt, then you have your
answer. The rest is all just about whether or not the feeler happens to
be sophisticated enough (i.e., smart enough, and motivated) to go the
distance, all the way from feeling pain to "cogito ergo sum."

But nothing hinges on all that. Feeling the feeling is enough. And
that's the hurdle AI must cross, between just implementing some form of
DOING (be it T2, T3, or T4) and implementing DOING FEELINGLY (which
is what makes us intelligent creatures with minds, rather than
Zombies, which are like stones or steam-engines, but with a lot more
performance capacity).

If we WERE Zombies, passing the Turing Test would be the DEFINITION of
"intelligence." But we are not Zombies, so there is more to it than
that. Trouble is, because of the other-minds problem, only the
candidate itself, the feeler (if it really does feel), can know for
sure whether or not our model has successfully implemented
intelligence! Turing Testing can tell us only whether or not it has
successfully implemented (Zombie) performance capacity.

Stevan Harnad


