Re: Chalmers: Computational Foundation

From: Hudson Joe
Date: Mon Mar 26 2001 - 19:15:40 BST

> Hudson
> Well if there is a performance requirement in real-time functionality
> then its not implementation independent is it? If it is not then it
> can't be pure computation can it?

> Actually, others have singled out real-time processing constraints as
> implying that cognition may not be implementation-independent
> computation. But that just sounds like it's waiting for more speed and
> capacity, rather than something different from computation itself.

Right, so if speed is relevant then at the very least cognition is not
computation alone, which by definition is implementation-independent,
which would make speed irrelevant. I don't see why this is a difficult
point to accept.

> Hudson
> I wasn't talking about THOUGHTS of the "I am feeling X" kind. I was
> refering to FEELINGS that can only be described using words as "I am
> feeling X". I quite agree being able to reflect on being able to reflect,
> or on a particular feeling is unnecessary for something to feel.
> If a pinch causes pain to a worm then a pinch means pain to a worm. The
> meaning IS the feeling in this case.

> I can't quite follow your point. A worm feels, right? If we could get a
> computer to feel too, a lot of our problems (about whether it's really
> thinking, really intelligent, really has a mind) would be solved.
We would need to understand how we got the computer to feel first
(so, which bits of the computer and program are essential)! But before we
could even know that, we would need to be able to verify the feeling
capacity, as required. But this would have to be done without becoming
the computer's mind, so we would need to know how we got the computer to
feel. Catch-22. You can't do one without the other.
It's the other-minds problem again.

But this is beside my point. I'm trying to show intuitively that some
sort of self-identity is essential for something to feel. Because of the
other-minds problem this has to be done intuitively.

> So, in a nutshell, the answer to the question "Is the computer really
> thinking, or just performing as if it were thinking?" is: "If it's
> feeling, it's thinking; if not, not."

When was this ever in question?

> Hudson
> There needs to be a 'sense' (not
> necessarily a reflective thought!) of self for a creature to feel.
> The creature must associate the feeling with itself (again not
> necessarily at the "cogito ergo sum" level) or rather the creature's
> experience of feeling is fused with its identity (conscious or not).

> I don't know what all this means. It seems to me all a system needs in
> order to feel is to feel! All these other conditions sound like
> speculation...

You've lost me. So unless a discussion is backed up with experimental
evidence or theoretical proofs it's pointless? Of course it's speculation.
What in cognitive science, or anything else outside the
self-contained world of mathematics, isn't?

A little sensible speculation is needed to move the debate forward.
Otherwise it will quickly get boring.

Do you disagree with my speculation? If so why?

> Hudson
> I say awareness starts with self-awareness but by self-awareness I don't
> mean the ability to reflect on one's own existence, I mean the 'sense' of
> existence, and the existence of an identity.

> But I don't understand what you mean. Can you translate it into the
> language of feeling -- but without making any extra speculative
> assumptions?

What do you mean by 'the language of feeling'?

> Hudson
> I agree this feeling ability is what gives us minds but not what makes us
> intelligent.

> How about:
> T3 capacity + feeling
> vs.
> T3 capacity - feeling
> ?

T3 capacity may not have feelings in the first place to subtract.

This archive was generated by hypermail 2.1.4 : Tue Sep 24 2002 - 18:37:25 BST