From: Hudson Joe (email@example.com)
Date: Thu Mar 22 2001 - 13:49:44 GMT
> Yes. But so what? What does that performance requirement have to do
> with the question of whether or not cognition equals computation, or
> of whether or not cognition must have a real-time past?
Well, if there is a performance requirement for real-time functionality,
then it's not implementation-independent, is it? And if it is not, then
it can't be pure computation, can it?
I think a real-time past is obviously unnecessary for a successful
T3 candidate, but as for a mind, I don't know.
> Does a worm feel pain? I don't know. Let's suppose for a moment it could
> be aware of pain or other sensations.
> If a worm feels, then it is just as relevant to what we have been
> discussing as you or I are. If it does not feel, then it is just as
> irrelevant as a stone.
> Because of the other-minds problem, there's no way for anyone to know
> the truth except the worm (and it's too stupid to worry about it). My
> guess is that a worm does feel.
> Who is being aware? The worm. Who is the worm? No one, it's just a worm.
> So what?
> Then what does a pinch mean to a worm? If we get pinched the feeling is
> always: "'I' am in pain."
> That's what it is to a sophisticated person like you, or Descartes.
> But to a simpler person, or an infant, or someone in a delirious
> state, the pain is still there, it still feels like something, it's
> still being felt -- it's just that all those other fancy "cogito ergo
> sum" ("I think therefore I am") thoughts are not going on.
> But we are talking about whether a system has a mind at all; not just
> about whether it happens to be sophisticated enough to think those
> fancy thoughts. I didn't ask about what a pinch "means" to a worm; just
> about whether or not it FEELS it.
I wasn't talking about THOUGHTS of the "I am feeling X" kind; I was
referring to FEELINGS that can only be described using words such as "I am
feeling X". I quite agree that being able to reflect (whether on reflecting
itself, or on a particular feeling) is unnecessary for something to feel.
If a pinch causes pain to a worm, then a pinch means pain to a worm. The
meaning IS the feeling in this case.
> You could say the relevance of the sensation is
> 'grounded' in the sense of self. If there is no self how is sensation or
> feeling relevant and who is it relevant to?
> All you need to say is that the feeling is felt. It's felt by the
> feeler. All the rest is just theorizing about what's going on. We
> humans not only feel, hence think (a process is not a thought process
> unless it is felt by someone; otherwise it's just a Zombie process),
> but we also think ABOUT feeling and thinking (and self and AI). Those
> are the extras, not the essentials.
But this doesn't answer the question. There needs to be a 'sense' (not
necessarily a reflective thought!) of self for a creature to feel.
The creature must associate the feeling with itself (again, not
necessarily at the "cogito ergo sum" level); or rather, the creature's
experience of feeling is fused with its identity (conscious or not).
> All of that is just the specific CONTENT of our feelings and felt
> thoughts. We might just as well have thought only about eating and
> sleeping. It makes no difference. Implementing mental processes means
> implementing ANY of those, not particularly just the sophisticated
> ones. (And failing to implement them means failing to implement ANY of
> them.)
> What and who is doing the feeling if there
> is no self-awareness? Who is the 'I' in "I am in pain" for the worm?
> The feeler of the feeling is feeling it, whether or not it also happens
> to be capable of reflecting and theorizing on what is going on. (I
> warned you that the problem is awareness, not "self-awareness."
> Self-awareness is just a luxury that most normal, thinking adults
> also happen to have...)
I say awareness starts with self-awareness, but by self-awareness I don't
mean the ability to reflect on one's own existence; I mean the 'sense' of
existence, and the existence of an identity.
> I think without the anchor point of a sense of self (i.e. self awareness),
> awareness has no meaning. I think you need to start by being aware of
> yourself before you can be aware of anything else.
> Reflect for a while on whether "feeling" makes any sense without a
> "feeler." (Does "unfelt feeling" make any sense?) If you realize that
> part of the nature of feeling is that it's felt, then you have your
> answer. The rest is all just about whether or not the feeler happens to
> be sophisticated enough (i.e., smart enough, and motivated) to go the
> distance, all the way from feeling pain to "cogito ergo sum."
My original points could have been more clearly put.
> But nothing hinges on all that. Feeling the feeling is enough. And
> that's the hurdle AI must cross, between just implementing some form of
> (1) DOING (be it T2, T3, or T4) and implementing DOING FEELINGLY (which
> is what makes us intelligent creatures with minds, rather than
> Zombies, which are like stones or steam-engines, but with a lot more
> performance capacity).
I agree this feeling ability is what gives us minds, but not what makes us
intelligent.
> If we WERE Zombies, passing the Turing Test would be the DEFINITION of
> "intelligence." But we are not Zombies, so there is more to it than
> that. Trouble is, because of the other-minds problem, only the
> candidate itself, the feeler (if it really does feel), can know for
> sure whether or not our model has successfully implemented
> intelligence! Turing Testing can tell us only whether or not it has
> successfully implemented (Zombie) performance capacity.
Agreed from the start.
This archive was generated by hypermail 2.1.4 : Tue Sep 24 2002 - 18:37:25 BST