From: HARNAD Stevan (firstname.lastname@example.org)
Date: Tue Mar 27 2001 - 12:57:40 BST
On Mon, 26 Mar 2001, Joe Hudson wrote:
> if speed is relevant then at the very least cognition is not
> computation alone, which by definition is implementation-independent,
> which would make speed irrelevant. I don't see why this is a difficult
> point to accept.
It is not difficult to accept; it just doesn't have any strong
implications. Cognition is not just computation, it's computation
executed sufficiently quickly...
> > I can't quite follow your point. A worm feels, right? If we could get a
> > computer to feel too, a lot of our problems (about whether it's really
> > thinking, really intelligent, really has a mind) would be solved.
> We would need to understand how we got the computer to feel first
> (so, which bits of the computer and program are essential)! But before we
> could even know that we would need to be able to verify the feeling
> capacity as required. But this would have to be done without becoming
> the computer's mind, so we would need to know how we get a computer to
> feel. Catch-22. You can't do one without the other.
> It's the other-minds problem again.
> But this is beside my point. I'm trying to show intuitively that some
> sort of self identity is essential for something to feel. Due to the
> other minds problem this has to be done intuitively.
But even apart from the other-minds problem, I don't think you have
shown this. You have to be able to feel to feel: That's a tautology.
"Unfelt feelings" are self-contradictory. So it is part of "being able
to feel" to be a "feeler." But "self identity" (if it is anything more
than being the feeler of felt feelings (a rather wordy way of saying
just "feeling") is a theoretical notion that perhaps only sophisticated
Cartesian thinkers like you and Descartes have, whereas what we want is
something a worm has too.
> > So, in a nutshell, the answer to the question "Is the computer really
> > thinking, or just performing as if it were thinking?" is: "If it's
> > feeling, it's thinking; if not, not."
> When was this ever in question?
That was always the question: Is it really intelligent or just TTing? Is
it really thinking or just TTing? Does it really have a mind or is it
just TTing? The difference in each case is whether it feels.
> > Hudson
> > There needs to be a 'sense' (not
> > necessarily a reflective thought!) of self for a creature to feel.
> > The creature must associate the feeling with itself (again not
> > necessarily at the "cogito ergo sum" level) or rather the creature's
> > experience of feeling is fused with its identity (conscious or not).
> > I don't know what all this means. It seems to me all a system needs in
> > order to feel is to feel! All these other conditions sound like
> > speculation...
> You've lost me. So unless a discussion is backed up with experimental
> evidence or theoretical proofs it's pointless? Of course it's speculation.
> What in cognitive science, or anything else outside the
> self-contained world of mathematics isn't?
> A little sensible speculation is needed to move the debate forward.
> Otherwise it will quickly get boring.
> Do you disagree with my speculation? If so why?
It is adding extra, arbitrary requirements on what it takes to be a
system that feels. On the face of it, to be a system that feels, all a
system needs to do is feel (whatever that is), just as you or I do when
we feel a headache or toothache. You say that's not enough: There has to
be a "sense of self". A worm can't feel a headache (if it has a head!)
unless it has a "sense of self." Sure sensible speculation is welcome,
but this one seems arbitrary and unnecessary. What on earth is a "sense
of self," and why does a worm need one to feel a headache?
Or, to put it another way: Why a "sense of self" rather than just a
sense of anything? Sensing, after all, is just feeling!
> > Hudson
> > I say awareness starts with self-awareness but by self-awareness I don't
> > mean the ability to reflect on one's own existence, I mean the 'sense' of
> > existence, and the existence of an identity.
> > But I don't understand what you mean. Can you translate it into the
> > language of feeling -- but without making any extra speculative
> > assumptions?
> What do you mean by 'the language of feeling'?
What is "awareness" is it doesn't mean "feeling something"? And what is
"self-awareness" if it doesn't mean "feeling something (else)"?
> > Hudson
> > I agree this feeling ability is what gives us minds but not what makes us
> > intelligent.
> > How about:
> > T3 capacity + feeling
> > vs.
> > T3 capacity - feeling
> > ?
> T3 capacity may not have feelings in the first place to subtract.
The notation was ambiguous. I meant:
(1) T3 capacity WITH feelings
(2) T3 capacity WITHOUT feelings.
I'm suggesting that when we get right down to it, (1) is what we mean by
"real" intelligence and (2) is simply what it is: a behavioural
capacity, T3. All we can do is HOPE that you can't really have full T3
capacity without feelings, hence real intelligence (thinking, mind).
This archive was generated by hypermail 2.1.4 : Tue Sep 24 2002 - 18:37:25 BST