> > From: "Darren Snellgrove" <D.P.Snellgrove@soton.ac.uk>
> > 3rd Year Philosophy Student
Dear Stevan, sorry to butt in, but just a few points regarding the
following:
> The identity theorist can SAY it, but what can it MEAN to say mind and
> brain are the same? That a toothache is the same as the activity in
> certain parts of my nervous system? I'll take your word for it, but that
> sure doesn't do the trick for me in the way telling me that heat is the
> same as average molecular energy, or water is the same as H2O does.
>
> Why not? The main reason is the Turing Test: The pain system in the
> brain has a function: It signals injury and triggers behaviours that
> protect against injury (I simplify, but the same would be true
> if I told a fuller story of nociception, adaptation and learning). Those
> behaviours can be described and understood functionally, as can the pain
> mechanisms that generate them. But once you want to say: "And that's it;
> that's all there is to it; feeling pain is simply identical with the
> activity of the system I've just described," there always remains that
> niggling question: "But why can't you have all those FUNCTIONS [injury
> avoidance, etc.], so admirably generated by this system, WITHOUT FEELING
> ANYTHING AT ALL, indeed, without anyone being at home in there?"
> What would be DIFFERENT about a Turing-Indistinguishable system that was
> just a Zombie?
I agree with what you have said up to the point where you suggest that
there must be something more; that suggestion seems to contradict your
own argument. I take it you are referring to the notion of
consciousness: there seems to be a gap between neurological
descriptions of what happens when, for example, I look at the
computer, and my actual experience of seeing the computer. But why
must this gap exist? Doesn't it arise only because the neurological
description is not attempting to describe my "seeing" of the page, but
rather to define the processes which occur when I see it? In that
sense there is no gap. Likewise, your experience of water is not the
same as the definitional description provided by H2O.
With reference to Turing machines, I fail to see why an artificial
system of some form cannot have "experience"; it does not have to be
just a Zombie. Furthermore, functionalism may serve well in explaining
input/output relationships, but I do not see how it can provide any
assistance with the mind/brain problem; indeed, functionalism is
perfectly compatible with just about every other theory of the mind.
This archive was generated by hypermail 2b30 : Tue Feb 13 2001 - 16:23:15 GMT