From: Salcedo Afonso (afonso@mac.com)
Date: Fri Feb 23 2001 - 05:26:08 GMT
SEARLE:
> 2. Is the mind a computer program?
> 3. Can the operations of the brain be simulated on a digital computer?
> I think 2 can be decisively answered in the negative. Since programs are
> defined purely formally or syntactically and since minds have an intrinsic
> mental content, it follows immediately that the program by itself cannot
> constitute the mind.
Salcedo:
I think we can only decisively state that the mind is not a computer
program if we take the definition of a program as it is understood
today: a group of commands, or symbols, that have a defined meaning and
stand for some kind of operation to be carried out on the computer.
Human beings do have this capacity, but we must also take into account
that there is a mental content, a consciousness, or whatever it might
be called. But what if the mind itself, the thing we call
consciousness, is nothing but the reaction to some more complex set of
procedures or operations that can actually be defined formally and
described syntactically, in a way analogous to a computer program?
Then the mind could be considered a computer program, and computers
would be able to evolve to a true state of artificial intelligence:
artificial because it would not be organic-based, and intelligence
because it would mimic the functioning of the brain/mind.
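To make that purely formal sense of "program" concrete, here is a
minimal sketch in Python (the symbols and the rule table are invented
purely for illustration): rules are applied to symbols and new symbols
come out, but at no point does the program attach any meaning to them.

    # A purely syntactic program: it rewrites symbols according to formal
    # rules, but nothing in it knows what the symbols mean.
    RULES = {
        "SQUIGGLE": "SQUOGGLE",   # rule 1: replace one shape with the other
        "SQUOGGLE": "SQUIGGLE",   # rule 2: and vice versa
    }

    def manipulate(symbols):
        """Apply the formal rules to each symbol in turn."""
        return [RULES.get(s, s) for s in symbols]

    print(manipulate(["SQUIGGLE", "SQUOGGLE"]))  # ['SQUOGGLE', 'SQUIGGLE']

The manipulation is perfectly well defined, yet the question of what
"SQUIGGLE" means never arises inside the program; that is exactly the
gap Searle is pointing to.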
SEARLE:
> So the answer to the second question is obviously "No".
Salcedo:
It is "obviously No" only for as long as we do not understand how the
brain actually works. Scientific theory keeps evolving and correcting
itself, and this is not a proven statement.
SEARLE:
> The answer to 3. seems to me equally obviously "Yes", at least on a natural
> interpretation.
Salcedo:
We have to be careful when considering question 3. When asking whether
we can actually simulate brain operations on a digital computer, we
have to remind ourselves that a simulation is not, by itself, the same
thing as what it simulates.
The brain performs mathematical operations, which can obviously be
defined on a digital computer with no difficulty at all.
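As a toy illustration of that last point (my own sketch, with invented
weights and inputs, not anything taken from Searle), the arithmetic of
a single idealised neuron runs on a digital computer in a few lines;
what the program reproduces is the arithmetic, not the biology.

    # Toy simulation of one idealised neuron: weighted sum plus threshold.
    # The weights and inputs are invented; this simulates the arithmetic,
    # not the biochemistry, of a real neuron.
    def neuron_fires(inputs, weights, threshold=1.0):
        activation = sum(i * w for i, w in zip(inputs, weights))
        return activation >= threshold

    print(neuron_fires([1.0, 0.5, 0.0], [0.8, 0.6, 0.3]))  # True (1.1 >= 1.0)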
SEARLE:
> At some level of description brain processes are syntactical; there are
> so to speak, "sentences in the head".
Salcedo:
If at that level of description we can get a syntactical representation
of the process, then by the Church-Turing thesis we can implement it on
a digital computer. And as such, the brain itself would be a digital
computer. We would still need to understand how these processes are
semantically related and why they mean what they do: how, and what,
they actually mean.
SEARLE:
> It is furthermore a consequence of the Church - Turing thesis and Turing's
> theorem that anything a human can do algorithmically can be done on a
> Universal Turing Machine.
> Now it seems reasonable to suppose there might also be a whole lot of mental
> processes going on in my brain nonconsciously which are also computational.
Salcedo:
But what is consciousness? How can we distinguish something that is
conscious to me from something that is not? Is a thought itself a
conscious process, because it is part of reasoning, or is it
unconscious, because it happens naturally? No one ever pre-defines what
thought he or she is going to have; and if they did, they would be
processing a thought as well.
What matters here is which of these can be algorithmically defined. If
we assume that both could be described as algorithms, they would then
be computational, and thus we could simulate them on a digital
computer.
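To spell out what "described as an algorithm" buys us under the
Church-Turing thesis, here is a minimal sketch of a Turing-machine
simulator (a toy example of my own, not something from Searle's paper):
once a process has been written down as a finite table of formal rules,
a general-purpose machine can carry it out. The example machine simply
flips 0s and 1s and then halts.

    # Minimal Turing-machine simulator; the rule table below is a toy
    # machine that flips every 0 to 1 and every 1 to 0, then halts.
    def run_tm(rules, tape, state="start"):
        tape = list(tape)
        head = 0
        while state != "halt":
            symbol = tape[head] if head < len(tape) else "_"
            state, write, move = rules[(state, symbol)]
            if head < len(tape):
                tape[head] = write
            else:
                tape.append(write)
            head += 1 if move == "R" else -1
        return "".join(tape)

    # (state, symbol read) -> (next state, symbol to write, head move)
    FLIP = {
        ("start", "0"): ("start", "1", "R"),
        ("start", "1"): ("start", "0", "R"),
        ("start", "_"): ("halt", "_", "R"),
    }

    print(run_tm(FLIP, "0110"))  # prints 1001_ (with a trailing blank)

Whether any conscious or nonconscious brain process really is such a
rule table is, of course, exactly what is at issue.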
SEARLE:
> We try to discover the programs being implemented in the brain by programming
> computers to implement the same programs. We do this in turn by getting the
> mechanical computer to match the performance of the human computer (i.e. To
> pass the Turing Test) and then getting the psychologists to look for evidence
> that the internal processes are the same in the two types of computer.
Salcedo:
How can we actually compare internal representations between something
that is, for now, completely nonconscious, such as a computer program,
and something that can be carried out either consciously or
unconsciously by a human brain?
If different people think in different ways, and therefore reach
completely different outcomes even while running exactly the same
internal processes, how can we actually say that the mechanical
computer mirrors the brain computer?
Can we actually generalise from one human brain to all human brains, as
we can with a mechanical computer? If we took several mechanical
computers that were found to have the same internal processes as the
human brain, would we get results as different from one another as
those we would get by checking what different people would think or do
given the same situation or input?
Would the computers "show" different personalities, different ways of
thinking?
SEARLE:
> The idea is that unless you believe in the existence of immortal Cartesian
> souls, you must believe that the brain is a computer.
Salcedo:
So what if I believe in the existence of a soul, and that mere belief
could possibly be transcribed algorithmically into a set of
computational processes?
Would a mechanical computer that mirrored these computational processes
then believe it had a soul?
A belief does not make something true. So would the computer be fooled
into thinking it had a soul, just as I was?
SEARLE:
> Computational states are not discovered within the physics, they are assigned
> to the physics.
Salcedo:
Computational states are thus not intrinsic to the physics. This is why
it is so important to the discussion of whether or not brain processes
are computational. They are not said to be computational because of the
physical and chemical processes that occur in the brain, but only
because they are equivalent to simple symbol manipulation, be it
neurons changing states or 1s and 0s being manipulated. Whether a
process counts as computational is entirely relative to the observer
who assigns that interpretation to it.
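A small sketch may make this observer-relativity vivid (the gate and
the voltage labels are a toy example of my own, not Searle's): the same
physical device, described only in voltages, "computes" AND under one
labelling of voltages as bits and OR under the opposite labelling.

    # The "physics": a device whose output voltage is high only when both
    # input voltages are high.  No 0s or 1s appear at this level.
    PHYSICS = {
        ("high", "high"): "high",
        ("high", "low"):  "low",
        ("low",  "high"): "low",
        ("low",  "low"):  "low",
    }

    def as_bits(labelling):
        # Read the same device under a chosen voltage-to-bit labelling.
        return {(labelling[a], labelling[b]): labelling[out]
                for (a, b), out in PHYSICS.items()}

    print(as_bits({"high": 1, "low": 0}))  # reads as an AND gate
    print(as_bits({"high": 0, "low": 1}))  # the same physics reads as OR

Nothing in the device changes between the two readings; only the
observer's assignment does.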
SEARLE:
> There are a whole lot of symbols being manipulated in the brain, 0's and 1's
> flashing through the brain at lightning speed and invisible not only to the
> naked eye but even to the most powerful electron microscope, and it is these
> which cause cognition.
Salcedo:
Again, the 0s and 1s don't have a physical existence. If this is the
human computer program, then isn't the mind non-physical? What is and
what is not meta-physical? If the mind can't be physically described,
can it actually be considered computational?
The thing is, what we think can only be known to us and can't ever be
fully transcribed into a physical existence. Why do we think? It just
happens. It's intrinsic to our nature. It belongs to us. But how? How
can it be intrinsic to our character if, at the same time, it doesn't
belong to our physical nature?
This is by far the main debate between science and philosophy.
Philosophers believe in the meta-physical mind, something that belongs
to us but at the same time doesn't. Something intrinsic to us, because
we are born with it, but at the same time extrinsic to us, because it
is not directly accessible to other humans.
SEARLE:
> So the puzzle is, how do we reconcile the fact that syntax, as such, has no
> causal powers with the fact that we do give causal explanations that appeal to
> programs?
Salcedo:
We give causal explanations that appeal to programs because we expect
some specific behaviour, since we know the rules and instructions that
define the program. We know that if we press button A we get result B,
and that is the only kind of causal explanation we can give for
plain-syntax programs and their processes.
If, trying to mirror the brain, we build a mechanical computer, we
might causally deduce after proper testing that our model is actually
mirroring the brain's function. The trouble with this, and I agree with
Searle, is that this would never be acceptable if, for that function,
we actually understood the biological processes that are inherent to
that operation.
SEARLE:
> In the case of cognition the pattern is at much too high a level of
> abstraction to explain such concrete mental (and therefore physical) events as
> the occurrence of a visual perception or the understanding of a sentence.
Salcedo:
In my opinion, the only pattern one could associate with brain
processes in a human being, and that would help predict action/reaction
pairs, would be the concept of personality. A person tends to react in
the way that his or her personality defines. But what exactly is
personality? There is no way to describe it physically and, as such, no
way to use it to explain the events that are processed in the brain.
And we're back in the same hole.
SEARLE:
> So how do we get computation into the brain without a homunculus?
Salcedo:
With a mechanical computer we'll always have the homunculus (a
diminutive human being without any form of physiology), which could be
a regular computer user. He would be giving his own interpretation of
what the computer outputs. When the user leaves, the computer is left
as nothing but an electronic circuit. But what if the user leaves, the
computer continues to perform the process it was asked to, and days
later the same user comes back to interpret the result? That would be
possible, and the computer would still be able to perform its
calculations without the homunculus.
When considering the brain, who is the homunculus? It's obviously the
owner of the brain. If we now consider a brain without a homunculus,
what would it be? How could it still compute? Would it compute? The
thing is that, physically, we cannot envision a brain working without a
person, or indeed a person being alive without a human brain. But we do
not know what would happen if the brain were considered a separate
entity.
SEARLE:
> The upshot of this part of the discussion is that in the sense of
> "information" used in cognitive science it is simply false to say that the
> brain is an information processing device.
Salcedo:
I agree with Searle on this 100%. In my opinion, considering the brain
to be just an information-processing device would be comparing it to
the information processing that goes on in a mechanical computer
system. The brain is completely interconnected with all the other human
senses at the same time, and, as Searle describes, it is not only a set
of biological processes; it also has intrinsic characteristics and
processes that happen both consciously and unconsciously at the same
time.
Again, calling the brain an information-processing unit would be far
too large an abstraction, and wouldn't help at all in understanding how
brain functions can be correctly modelled on a mechanical computer.
Afonso Salcedo < >