MAKING SENSE OF
CAUSAL INTERACTIONS BETWEEN CONSCIOUSNESS AND BRAIN.
Journal of Consciousness Studies, 9(11), 2002, pp. 69-95.
Max
Velmans, Department of Psychology, Goldsmiths, University of London, New Cross,
London SE14 6NW, England.
ABSTRACT. My target article (henceforth referred to as TA) presents
evidence for causal interactions between consciousness and brain and some
standard ways of accounting for this evidence in clinical practice and
neuropsychological theory. I also point
out some of the problems of understanding such causal interactions that are not
addressed by standard explanations.
Most of the residual problems have to do with how to cross the
“explanatory gap” from consciousness to brain. I then list some of the reasons
why the route across this gap suggested by physicalism won't work, in spite of
its current popularity in consciousness studies. My own suggested route across
the explanatory gap is more subterranean, where consciousness and brain can be
seen to be dual aspects of a unifying, psychophysical mind. Some of the steps on this deeper route still
have to be filled in by empirical research. But (as far as I can judge) there
are no gaps that cannot be filled—just a different way of understanding
consciousness, mind, brain and their causal interaction, with some interesting
consequences for our understanding of free will. The commentaries on TA examined many aspects of my thesis viewed
from both Western and Eastern perspectives. This reply focuses on how
dual-aspect monism compares with currently popular alternatives such as
“nonreductive physicalism”, clarifies my own approach, and reconsiders how well
this addresses the “hard” problems of consciousness. We re-examine how conscious experiences relate to their
physical/functional correlates and
whether useful analogies can be drawn with other, physical relationships that
appear to have dual-aspects. We also
examine some fundamental differences between Western and Eastern thought about
whether the existence of the physical world or the existence of consciousness
can be taken for granted (with consequential differences about which of these
is “hard” to understand). I then suggest a form of dual-aspect Reflexive Monism
that might provide a path between these ancient intellectual traditions that is
consistent with science and with common sense.
I would like to thank the commentators on TA for their many
excellent commentaries. To simplify the process of relating these commentaries
and my replies to the original text, I will deal with them according to topic
in the sequence that these topics are treated in TA. At various points I refer
to more detailed treatments of the issues addressed in my recent book Understanding Consciousness (henceforth
referred to as UC). A few of the commentaries elaborate on TA and do not
require a response. John Kihlstrom for example gives an
excellent review of the evidence for the causal effects of consciousness on
body/brain, the explanandum of TA, and Todd
Feinberg outlines an understanding of consciousness/brain interactions similar to
my own, arrived at independently.
Of necessity, the bulk of my response is reserved for those who
challenge aspects of my analysis, seek clarification, or defend alternative
analyses.
Can reductive
physicalism be defended?
The causal interactions between consciousness and brain could be easily explained, at least in principle, if consciousness were nothing more than a state of the brain. But, in the Appendix to TA I have given some of my reasons for doubting that science will ever demonstrate human phenomenal consciousness (C) to be nothing more than a state of the brain (B). I accept the widely held view that for any given conscious experience there will be associated physical causes and correlates within the brain. However, causation and correlation are not ontological identity. Identity is symmetrical (if B=C, then C=B) and obeys Leibniz's Law (if B=C, then all the properties of B must also be the properties of C and vice-versa). Correlation is symmetrical (if B correlates with C, then C correlates with B) but it does not obey Leibniz's Law. Causation is neither symmetrical nor does it obey Leibniz's Law. Why does this matter? Suppose that third-person science had established a perfect 1:1 correlation between a given experience C and its physical correlates B. Wouldn't that suffice to establish a reductive identity between them? No, because major differences between the first-person phenomenal properties of C and the third-person physical properties of B would remain.[1] C and its correlate B would of course be intimately related. But an “intimate relationship” need not be a reductive identity. According to dual-aspect theory, B and C are complementary aspects of mind-itself. This would explain both their perfect correlation and their phenomenal differences. So it would provide a better fit to the available facts.
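To set the logic out schematically (this simply restates the relations just described; F ranges over properties):

  Identity:     B = C  ⇒  C = B;   and   B = C  ⇒  F(B) ⇔ F(C) for every property F   (Leibniz's Law)
  Correlation:  corr(B,C)  ⇒  corr(C,B);   but corr(B,C) does not entail that B and C share all their properties
  Causation:    cause(B,C) entails neither cause(C,B) nor that B and C share all their properties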
In defence of a “diffident physicalism” Torrance argues that if C and B were identical, that would also explain their correlation and “given the problems inherent in competing theories, asserting an identity relation seems reasonable, in the absence of a better alternative.” Given its better fit to the available facts, I would argue that dual-aspect theory does provide a better alternative and that its so-called problems are not problems at all (see below). I suspect that, on reflection, Torrance would agree that if physicalism can only establish consciousness/brain identity by assuming it, then it is on very weak ground indeed.
Can nonreductive
physicalism be defended?
Of the physicalist theories, those of the emergent variety are perhaps the most plausible. It is obvious that higher-order physical properties emerge from the brain's micro-operations. In a sense, conscious experiences also emerge from the brain (in the sense that brain states can be said to cause or correlate with conscious experiences). I nevertheless resist the view that conscious experiences are just higher-order physical states of the brain. Higher-order physical states of the brain are likely to correlate with given conscious experiences, but this does not warrant a reduction of the experiences to their correlates for the reasons outlined in the section above. Torrance suggests that it is a little unfair to tar the emergentists with the same brush as the reductionists. Emergentists accept that what emerges may be toto mundo distinct from the processes that have produced it. I agree, and I also agree that there is no problem about treating higher-order emergent properties of the brain as physical if, on commonly accepted criteria, they are physical (e.g. the electrical and magnetic fields detected by EEG and MEG). On the other hand, if, on commonly accepted criteria, conscious experiences would be categorised as mental as opposed to physical, then calling them “physical” becomes a mere relabelling exercise. To give this exercise bite, one would have to show that all the properties that are normally thought of as “mental” (first-person properties) are in fact physical (third-person properties); otherwise one is simply left with all the problems of why “what it is like to be something” should emerge from or have a causal influence on the physical world. Torrance argues that a weak or wide form of physicalism might nevertheless be coherent and, at the very least, tenable (in spite of there being no strong case for it). That may be so. But a physicalism that weak and wide would be empty. All the puzzles of consciousness would simply slip through its net.
In defence, Torrance suggests that there are other reasons “to do with ontological economy, conceptual conservatism, causal closure, and so on, against introducing non-physical properties into the universe.” I accept that ontological economy (simplicity) is desirable, but it has to be balanced against sufficiency, and I would argue that ontological monism combined with epistemological dualism achieves that fine balance. I don't agree that conceptual conservatism should be the order of the day when we are faced with a theoretical orthodoxy about the nature of conscious experience that is so clearly at odds with our actual conscious experience. In any case, the explanatory system in UC does have causal closure as far as I can tell. Crucially, I am not “introducing non-physical properties into the universe.” I am merely suggesting another way to make sense of the phenomenal properties that are observed to be there.
Can nonreductive
physicalism be defended against my three threats to the third-person causal
status of consciousness?
1. We lack
conscious knowledge of the details of the processes that we are supposed to
consciously control. According to Van Gulick this does not trouble
nonreductive physicalism. As he notes, we often need very little knowledge of a
process and of its detailed workings in order to affect, control or initiate
it. Use of a computer for example does
not require one to know anything about the underlying structure of the
operations that execute its commands in machine language. Yet the high level control that we exercise
is surely conscious. I agree—and I have used a similar argument to defend my own
dual-aspect analysis of causality in TA (p x).
However, I think we need to be precise about the sense in which such “conscious control” can be “conscious.” Control
can be conscious in the sense that we are conscious of exercising control in this high level (global) way, and for
everyday purposes this experience of being in control is veridical (when we
think we have voluntary control, we usually do – see discussion of free will in
TA). Controlled processing can also be conscious in the sense that it results in a conscious experience. The
critical issue, however, is whether first-person conscious phenomenology actually controls or enters into physical processing, which seems
to contravene the principle that the physical world is causally closed.
2. The problem of
causal closure. If first-person
experiences are invariably accompanied by distinct physical correlates, and if
the physical world is causally closed, I don’t see how such experiences could exercise causal control—as the
relevant control would already be exercised by their physical correlates (see
the problem of “overdetermination” raised in the commentary by Chrisley & Sloman). The only escape route for “nonreductive
physicalism” is to argue that, one day, science will establish conscious
experiences to be nothing more than their physical correlates. As I have argued
in the Appendix of TA, however, third-person science is restricted to
establishing the neural causes and correlates of experiences. Given that causes and correlates are not
identities, this scientific route to establishing a reductive identity is
blocked.[2]
3. Consciousness
comes too late to affect the processes to which it most closely relates. Scientific claims for the causal efficacy of consciousness
are typically based on contrastive analysis. Psychologists commonly contrast
preconscious or nonconscious processing of a given type (e.g. preconscious
visual processing of input) with conscious processing of the same type (e.g.
visual processing where one becomes conscious of input) and then attribute any
functional differences between these to the operation of consciousness. In
Velmans (1991) I pointed out the fallacy of such attributions. Conscious experiences are a consequence of
sophisticated, focal-attentive processing and without focal-attentive
processing many forms of complex, novel functioning would not occur. However, the experiences themselves emerge
too late to affect that processing. Van
Gulick asks “But why should this impugn the commonly accepted causal status
of consciousness? Neither folk psychology nor any scientific model of
consciousness of which I know supposes that experiences produce the very
processing from which they themselves result.” This misses my point. Once
it is pointed out that consciousness results from a given process, of
course no sensible person would claim it to have a causal role in that
process. However, as far as I know,
prior to my 1991 BBS target article,
no one had pointed this out, claims for the causal efficacy of consciousness
based solely on nonconscious/conscious contrasts were legion, and they
continue unabated to this day (see e.g. Baars, Banks & Newman, 2002). I
accept, of course, that consciousness might in principle enter into processing
that follows its emergence. However, this proposal still has to surmount
the problem that the physical world is causally closed (see TA, Note 17 - and
discussion above).
Van Gulick asks whether my own view is just a variant of a form of
nonreductive physicalism that accepts explanatory pluralism. This would allow higher-level forms of
mental organisation to have properties that are best described at that level
rather than in the more basic terms of physics, without doubting that such
properties are ultimately physical. This applies, for example, to economics. No
one is worried by a money/matter problem or seriously advocates money/matter
dualism. So why could the same not be true of mind? Van Gulick also points out that my epistemology in TA is presented as first- versus third-person dualism rather than pluralism although he notes that this may well be more a
matter of exposition than of substantive disagreement.
As it
happens, these issues interconnect.
Rather than contrasting epistemological
dualism with epistemological pluralism, in my own work I combine these. That is, I
support the view that there are many forms and levels of explanation within both first- and third-person accounts.
As Van Gulick notes, pluralism is
commonplace in third-person science, with well-defined hierarchies defined by
the size and level of organisation of the phenomena, typically spanning
microphysical, macrophysical, chemical, biological, neurophysiological,
cognitive/functional, and social levels of organisation. Nonreductive
physicalism usually identifies conscious experience with just one level of this
hierarchy, typically the cognitive/functional level that, in turn, supervenes
on the brain's neurophysiology. In my view this is an oversimplification. Like
the forms of material organisation that they accompany, first-person
experiences can be described at different levels and may have an ontology at
different levels. Some experiences
appear to be socially determined, being, in part, social constructions grounded
in culture and history (e.g. what it is like to be the Prince of Wales). Others (such as empathy) are
quintessentially interpersonal, requiring the presence of at least one other
human being (see readings in Thompson, 2001, Between
Ourselves). Yet others, such as visual and auditory percepts, appear to be
largely individual, resulting from the binding of sensory features within an
integrated human brain. Under appropriately controlled conditions these
features can be further decomposed into the minimally discriminable phenomenal
differences studied by psychophysics and so on.[3]
It is
important to note that so-called “nonreductive physicalism” is reductive in that it claims conscious
experiences to be nothing more than a
form of higher order material organisation or cognitive functioning. While
dual-aspect theory accepts that every distinct experience has a distinct set of
third-person physical and/or functional correlates
(at social, personal or subpersonal levels of organisation) it resists the
physicalist suggestion that such conscious experiences are reducible to their
correlates at any of these
levels. Rather, first person accounts
of experience and third person accounts of accompanying physical functioning
remain complementary and mutually irreducible, whatever the level of organisation.
This
brings us to Van Gulick's Point
2. Do I equate the mental perspective
with the first-person perspective? No.
Like nonreductive physicalism, I treat the first person perspective(s) as a
subset of a larger set of mental perspectives, some of which are entirely third
person in nature, for example, those aspects of mental functioning described in
cognitive psychological accounts of the mind.[4] Unlike nonreductive physicalism, however, I
argue that mental processes that have a conscious phenomenology cannot be exhaustively described in third person
terms. While it is possible to describe
what people do or how their brains
function when they have beliefs, desires, etc., in third-person terms, without
reference to their first-person perspective it is not possible to describe what
they experience. I have given some of
my initial reasons for the irreducibility of first- to third-person accounts in
the Appendix to TA. As Van Gulick
does not take issue with this preliminary analysis I will not offer a deeper
defence of it here. Interested readers
will find a far more extensive analysis in chapters 3, 4 and 5 of UC (see also
my debate with Dan Dennett in Velmans, 2001).
This brings us conveniently to Van Gulick's point 3. As he notes, I
share the physicalist commitment to ontological
monism, but my dual-aspect view takes the ur reality (the nature of mind in
this case) to be neither physical nor mental.
Why? Precisely because it has both of these mutually irreducible,
first- and third-person aspects. Viewed
from the outside, the operations of ur mind appear to be operations of
brain. Viewed from a first-person
perspective, the operations of ur mind appear to be conscious experiences.
Which is it really? If one assumes (as
I do) that neither perspective is
necessarily illusory or deluded, then the nature of ur mind must support both the views that we have of it. Given this, its nature is better described
as “psychophysical” than “physical.”
This also addresses Torrance's
suggestion that my position is not really all that different from
nonreductive physicalism. He asks, “Doesn’t
monism imply unity? So are you not saying that the neuroscientist’s third
person facts and the subjective first-person facts are two equally real parts
of a single unity? But then, if one side of this unity is physical, mustn’t the
other side also be physical (or it’s not a unity)?” He then guesses correctly that,
“Perhaps Velmans’ answer to this is that neither the third-person physical
facts nor the first-person subjective facts are ultimately real, and that the
underlying bedrock of reality is neither the one nor the other. (I guess this
is implied by his calling it a ‘dual-aspect’ theory.)”
Viewing the mind-itself as psychophysical rather than
physical is more than a simple relabelling exercise. What does this form of
dual-aspect monism buy us? As I have argued in UC, “If consciousness and
its physical correlates are actually complementary aspects of a psychophysical
mind, we can close the “explanatory gap”
in a way that unifies consciousness and brain while preserving the
ontological status of both. It also
provides a simple way of making sense of all four forms of physical (P) and
mental (M) causal interaction.
Operations of mind viewed from a purely external observer’s perspective
(P→P), operations
of mind viewed from a purely first-person perspective (M→M), and
mixed-perspective accounts involving perspectival switching (P→M; M→P) can be
understood as different views (or a mix of views) of a single, psychophysical
information process, developing over time.
In providing a common psychophysical ground for brain and experience,
such a process also provides the “missing link” required to explain
psychosomatic effects.” (UC, p 251) (see also TA page x)[5]
Escape from the problem of causal closure. In Van Gulick's words “If the physical factors revealed from the third person
perspective give a complete causal explanation of physical events and nothing
nonphysical can have a causal impact on the physical, then there does not seem
to be any room for other factors viewed from an alternative perspective to act
as causes of physical events.” I
agree. But we are not interested here
in purely physical events. We are
interested in the nature of mind, and according to the above, the nature of
mind is psychophysical. Unlike “nonreductive physicalism” this
analysis of mind and M→P causation is
genuinely nonreductive. And it is this that makes it immune to Kim's (1999) point that, if the physical world is causally
closed, either the mental reduces to the physical, or it must be epiphenomenal.
Unlike “nonreductive physicalism” I do not claim that first-person experiences
somehow enter into third-person physical functioning, so I do not need to
reduce these experiences to physical events to make good that claim. Within dual-aspect theory there is a more
intuitively plausible option. If the
mind is genuinely psychophysical, then an entirely third-person physical view
of it gives only a partial view of both its nature and its causal operations.
Brain states are genuine phenomena (manifestations of ur mind), viewable from a
third-person perspective, but conscious experiences are also genuine phenomena
(manifestations of ur mind) viewable from a first-person perspective.
Descriptions of brain states can be used to give a detailed account of the
operations of mind in terms of its physical manifestations. Descriptions of
first-person experience can be used to give an account of the operations of
mind in terms of its conscious manifestations. For scientific purposes,
third-person accounts are more useful. For everyday purposes, first-person
accounts are often more useful. Both are required for an account of mind to be
complete.[6]
In what sense does this complementary perspectives account advance our understanding of the “hard problems” of consciousness?
In UC and prior work I have argued that some of the problems of consciousness require conceptual advance, some require empirical advance, and some require both. Empirical questions include, “What are the necessary and sufficient conditions for consciousness in the human brain?” and “What are the neural correlates of consciousness?” Questions such as “What is consciousness?” “What is the function of consciousness?” and “How can one make sense of the causal interactions of consciousness and brain?” appear to be largely conceptual. Why? Because the considerable evidence that one can already bring to bear on these questions somehow fails to address them. Each and every one of us has a vast reservoir of conscious experience. Gathering more of it won’t clarify what consciousness is. Extensive contrastive analyses of “conscious” versus “nonconscious” processing have already been carried out in psychological science. These illustrate functional differences between processes that either are, or are not accompanied by phenomenal consciousness. But such contrasts don’t reveal what the conscious phenomenology itself does (see TA). Nor does the massive evidence for mind/body interactions reveal how to make sense of the causal interactions of consciousness and brain. It is the opacity of these questions to further data gathering that makes them “hard.”
The present TA deals only with the last of these “hard problems” (I deal with the other problems and with how they all interconnect in UC). According to Gray, however, these problems are tangential to the real hard problem. Rather, “the Hard Problem can be stripped down to just two questions: how does the brain create qualia; and how does the brain inspect them?” It will be apparent that I do not entirely agree. These two questions are two of many. As it happens, they have third-person aspects that are fully amenable to empirical research. One proper answer to the question “How does the brain produce qualia?” would be to specify the necessary and sufficient physical conditions in the brain for the appearance of conscious qualia. This can be investigated by contrasting physical conditions that are necessary and sufficient for the appearance of qualia with those that are not—a standard method in science. Viewed in third-person terms, “how the brain inspects qualia” can be explained in terms of “how it inspects representations at the focus of attention”, “makes them available for report”, and so on.
One might object, of course, that such third-person accounts don't answer the right question. The hard problem is not how one part of the brain might inspect and report on information in another part of the brain. It is how a physical brain could “inspect” a first-person conscious experience! But this is precisely the question that I do address! Ontological monism combined with epistemological dualism makes it clear that one can give a “pure third person account” of brain functioning (in terms of how subsystems in the brain inspect representations at the focus of attention). One can also give a “pure first-person account” of what is going on (in terms of the way that I, a conscious being, can inspect my own conscious experience). An account of “how the brain inspects its conscious experience” can then be seen to be a “mixed-perspective account, involving perspectival switching” (see TA). As Gray notes, “the only satisfying explanation will be one that shows how consciousness is linked to the scientific account that applies to the rest of that world.” In broad terms, that is what this analysis achieves.
Gray objects that, “The standard criterion for whether or not a proposed theory forms part of science is potential falsifiability by empirical observation. I cannot think of any such test of Velmans’ model, nor has he proposed any himself. The same is true, so far as I know, of all other versions of dual-aspect theory, including for example Chalmers' (1996) recent attempt to seek a common basis for the physical and conscious realms in an underlying stuff of ‘information’ (a move Velmans also makes, in his section on ‘the neural correlates of conscious experience’). Thus, Velmans’ proposed solution to the Hard Problem is purely philosophical, which is to say, purely verbal. It purports to tell us what we ‘really’ mean when we say things, respectively, from first-and third-person points of view. We need to move beyond this.” (p x) Rao also makes a similar complaint, when he suggests that I try to “resolve ontological issues by reducing them to epistemological ones.” (p x)
I accept the point that some aspects of my analysis have to do with how to understand “what we ‘really’ mean when we say things, respectively, from first-and third-person points of view”, namely my account of perspectival switching, and mixed-perspective explanations. However, the ontological monism combined with epistemological dualism that underpins this analysis is not just linguistic philosophy. It is a claim about the basic ontology of mind, its manifestations, and about how we can know its nature and its manifestations. For example, the proposal that first- and third-person perspectives of the mind are complementary and mutually irreducible is a claim about mind and how we can know it, in the same sense that wave-particle complementarity is a claim about the nature of light, or electromagnetic energy is a claim about the unified nature of electricity and magnetism.
I also accept that my global analysis of how to make sense of the causal relations of consciousness and brain is not a theory about the details of how information encoded in the brain is mapped into conscious phenomenology. However such details can, in principle, be settled by empirical research, which makes them so-called “easy” problems rather than “hard” ones. To make sense of the hard problems we need to think about them in a different way (perhaps in the way that I suggest). But that does not make my entire analysis unfalsifiable, and purely “verbal and philosophical.” [7] A quick revisiting of my case for dual-aspect theory and my critique of alternative theories will confirm that they are tightly linked to falsifiable, experimental and clinical evidence—to evidence of mind/body interactions (see Kihlstrom), to evidence that the physical world is causally closed, to evidence that phenomenal consciousness comes too late to affect the processes to which it most closely relates (Velmans, 1991), to evidence of preconscious and unconscious processing, and so on. Crucially, unlike the variants of physicalism and functionalism defended by Torrance, Van Gulick, and Chrisley & Sloman, the dual-aspect theory developed in TA also conforms closely to the evidence of first-person experience.
It is instructive to dwell on this last point. Although the dual-aspect analysis in TA, and the Reflexive Monism that underpins it in UC are broad theories about how to make sense of the relation of consciousness to brain rather than theories about the neurophenomenological details, they are first and foremost empirical theories that try to make sense of the combined third- and first-person evidence. If there were convincing third- or first-person evidence that challenges some aspect of these theories, or some clear flaw in how the analyses connect to the data, then I would modify or abandon the theories. Many experimental and theoretical developments could challenge the details of my analysis, for example current attempts to demonstrate that a cognitive unconscious does not exist (Perruchet & Vinter, 2003), or that “qualia” do not exist (see debate with Dennett in Velmans, 2001), or that the neural correlates of consciousness are not representational states that encode identical information (Chrisley & Sloman - see below), or that consciousness is actually a mental field that influences the activities of brain (Libet, in press). I would also abandon my “complementary perspectives” analysis if our everyday insights into the operations of our own minds based on our first-person experiences turned out to be largely wrong.
Compare this with physicalism. Physicalism draws its scientific respectability from its namesake “physics”. However physicalism is a philosophical thesis about the ontological nature of conscious experience, not a field of science. Its claim that first-person phenomenology reduces, without remainder, to states of the brain has no real evidence in its favour (neural causes and correlates are not identities), and massive evidence to the contrary (conscious phenomenology does not resemble brain states). This makes it basically a faith in the all-encompassing nature of third-person science—a commitment to a worldview that is immune to falsifying evidence. If one is looking for an unfalsifiable theory, here it is.
How conscious experiences relate to their physical/functional
correlates
At present, we know little about the physical nature
of the correlates of conscious experiences.
Nevertheless, in UC and TA I suggest that there are three plausible,
functional constraints imposed by the phenomenology of consciousness
itself. Normal human conscious experiences are representational (phenomenal
consciousness is always of something).
Given this, it is reasonable to assume that the neural correlates of such
experiences are also representational states. For a given physical state
to be the correlate of a given experience it is also plausible to assume that
it represents the same thing
(otherwise it would not be the correlate of that
experience). Finally, for a physical state to be the correlate of a given
experience, it is reasonable to suppose that it has the same “grain”.
That is, for
every discriminable attribute of experience there will be a distinct,
correlated, physical/functional state. As each experience and its physical
correlate represents the same thing it follows that each experience and its
physical correlate encodes the same information about that thing. That is, they are representations with the
same information structure.[8] I also point out that different representational systems employing
different formats can encode identical
information without themselves being identical. Neural correlates, for example, might function as representations (encoding identical information to that
displayed in their correlated phenomenology) without “mirroring” that conscious
phenomenology in any obvious physical sense. While such correlates might be iconic, they could also be
propositional, feature sets, prototypes, procedural, localised, distributed,
static, dynamic or whatever. The
operations on them might also be formal and computational, or more like the
patterns of shifting weights and probabilities that determine the activation
patterns in neural networks (TA, note 10). [9]
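As a purely illustrative aside, the point that different representational systems can encode identical information without themselves being identical can also be put in computational terms. The sketch below is my own illustration; the particular formats and names are arbitrary and carry no theoretical weight:

# Illustrative sketch only: the same information encoded in three
# different representational formats. The representations are not
# identical, but each preserves the same information and allows it
# to be recovered without loss.
import base64, json

message = "to be or not to be"

utf8_bytes = message.encode("utf-8")              # byte-string format
b64_text = base64.b64encode(utf8_bytes).decode()  # base64 text format
json_record = json.dumps({"line": message})       # structured JSON format

# The three representations differ as objects...
assert utf8_bytes != b64_text.encode() and b64_text != json_record

# ...yet each decodes back to exactly the same information.
assert utf8_bytes.decode("utf-8") == message
assert base64.b64decode(b64_text).decode("utf-8") == message
assert json.loads(json_record)["line"] == message

In the same spirit, neural correlates might encode the information displayed in conscious phenomenology in a quite different format without in any obvious sense “mirroring” it.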
Given that I do give a supporting case for
this in TA (and a far more detailed case in UC pages 236-251) it is hard to
understand Chrisley & Sloman's contention
that I “move without argument, from the representational nature of experiences
to the existence of neural correlates of these experiences which have the same
representational content as these experiences.” Nor is it easy to make sense of
their claim that I am “either making a rather strong reductionist assumption,
or (worse) postulating a dubious causal connection (between the structure of
experience and the structure of neural states).” As I have noted, different representational systems can encode
identical information without the systems
reducing to each other—and the relation between experiences and their
physical correlates is, by definition, correlation not causation.
Would the discovery of psychophysical
correlations be scientifically useful?
As noted above, even perfect 1:1 correlations between conscious states and physical states would not establish their ontological identity. It is also well accepted in science that correlation does not establish causation. Consequently, even exact neurophenomenological laws that chart the way that given physical correlates map onto given conscious experiences would not be causal laws. If such bridging laws could be found they might nevertheless document invariant, empirical relationships in a precise way—and few, I suspect, would doubt that this would be a major scientific advance. Rakover, however, disagrees. According to him, “correlational laws” are not “natural laws” and cannot fulfil the requirements of measurement that are accepted in science. Consequently he thinks that neurophenomenological laws cannot be used to make sense of the causal relationships between consciousness and brain.
In assessing how Rakover’s commentary relates to my TA it is important to first note
that he does not actually address the
detailed account that I have given of how to make sense of the causal
relationships of consciousness and brain, nor of the way that potential
neurophenomenological laws might fit into such an account. However he does
offer a critique of the scientific status of neurophenomenological laws as such,
and of the use of “information” as a unit of measurement in psychology. As these are important elements of my
analysis I will confine my reply to these relevant aspects of his critique.
Second, although Rakover gives the
misleading impression at the beginning of his commentary that my use of
neurophenomenological laws is out of step with psychological practice in that
it does not conform to “rules of the game that are accepted by the natural
sciences and cognitive psychology” (p x), he goes on to admit at the end of his
commentary that my use of such correlational relationships is entirely
conventional within psychology and that his real target is psychology! As he concludes on page x, his critique
“could also be directed at psychology at large. In comparison with research in
the natural sciences, psychological research is limited and does not progress
like physics (see Rakover, 1992).”
What are these supposed limitations on
psychological research? According to Rakover,
neurophenomenological laws do not fulfil the requirement for the “unit
equivalency” found in natural laws such as Newton's law of gravity, where the
units of measurement found on both sides of the equation S = ½GT² can
be shown to be equivalent. Let us suppose, for example, that neuropsychology
discovers the exact neural correlates of different subjective aspects of pain
phenomenology and manages to express its findings in neurophenomenological
laws. In such cases, Rakover asks, “Can it be shown that the
combination of the units of measurement on the right-hand side of the pain
equation is identical to the combination of the units on the left-hand side? To
the best of my knowledge the answer is no.” (p x) I agree. But such an absence
of unit equivalency provides yet another argument against reductive physicalism. It
has nothing to do with whether or not there are
distinct physical/functional correlates of distinct pain experiences, or with
whether or not it is possible to chart such relationships precisely in the form
of nonreductive neurophenomenological
laws that do not require “unit equivalency.”
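For readers unfamiliar with the idea, the “unit equivalency” that Rakover has in mind can be illustrated by a line of dimensional analysis on the equation cited above (taking distance in metres and time in seconds):

  [S] = m,   while   [½GT²] = (m/s²) × s² = m,

so that both sides of S = ½GT² reduce to the same unit. No comparable reduction is available for an equation relating, say, judged pain intensity to a measure of neural activity, which is why any neurophenomenological laws would have to be nonreductive, correlational laws of the kind discussed here.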
But would the absence of “unit equivalency”
make neurophenomenological laws unscientific? Consider Rakover's doubts about studies of pain phenomenology. Pain is often presented as a paradigm case of a
private, subjective, mental event within philosophy of mind. There are many
ways to measure the subjective experience of pain[10],
but at the present time no valid “objective” measure of pain experience (in
terms of a physiological index) exists.
In spite of this, over the period 1960 to 2002, the Medline database lists around 200,000
publications on pain and its alleviation, making it a heavily investigated area
of medicine. According to Rakover, such studies are restricted by
the absence of “fundamental measurement units” of the kind that obtain, say,
for the measurement of length, which sustain properties such as transitivity
and additivity. While this is true, it
is hardly news to anyone trained in psychological research, where it is taken
for granted that whenever numbers are assigned to psychological variables these
must be scaled in a way that is appropriate to those variables (reaction time
and error rate merit a ratio scale, subjective judgements of magnitude
generally merit an ordinal scale, categorical judgements a nominal scale, and
so on). Once an appropriate scale is
assigned, numbers derived from measurements of behaviour or subjective
judgements can be subjected to appropriate statistical analyses, and the
results interpreted as supporting hypotheses (or not) in the normal way.[11]
It is true that few relationships between
physical and psychological variables have been found to be sufficiently general
and orderly to merit the term “law” and even these do not satisfy “unit equivalency.” Perhaps the best example is Stevens’ power law J = kIˣ where
J is the judged intensity of a
stimulus (e.g. its brightness or loudness), k is a scaling constant, I is the physical intensity (e.g. specified in
lumens or decibels), and x is a constant whose value depends on the modality of
the judged stimulus (e.g. for judged loudness, x=0.3). Stevens’ law charts how
variations in the physical stimuli are translated into judged changes in the
way those stimuli are experienced.
Consequently it is “correlational” in precisely the way that Rakover describes. Does this mean that
Stevens’ law is unscientific? No. There are countless examples in science where
Nature does not fit into the neat conceptual boxes that we have prepared for
her, and the psychological and biological sciences have long abandoned the view
that the only relationships of scientific interest are fundamental causal laws
of the kind found in physics.
Functional models in cognitive psychology and compositional accounts of
the structure of biological systems are obvious cases in point. In psychophysics, Stevens’ power law may not
satisfy unit equivalency, but it nevertheless expresses empirically verifiable
relationships between physical dimensions of stimuli and subjective judgements
about those stimuli in a precise, systematic way, and it is in that sense
unquestionably scientific. Given this,
it is reasonable to hope that in some future neuroscience it may be possible to
develop neurophenomenological laws with equivalent precision and generality.
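To give a concrete sense of the precision already available, consider a worked example using the loudness exponent cited above. Since J = kIˣ,

  J₂/J₁ = (2I)ˣ/Iˣ = 2ˣ ≈ 1.23 for x = 0.3,

so the law predicts roughly a 23 per cent increase in judged loudness for a doubling of physical intensity. The prediction is quantitative and empirically testable even though the units on the two sides of the equation are not equivalent.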
Rakover also claims that information cannot be a unit of
psychological measurement. But again, few psychologists would agree. It is true
that, following George Miller's (1956) seminal paper The magical number 7 ± 2, psychologists have long accepted that
human mental processing is often too flexible and varied to be computable in
“bits” in the precise Shannon sense.
Nonetheless the psychological use of concepts derived from information
theory and/or the more general principles of information processing developed
within electrical engineering is ubiquitous—to the point that, in cognitive
psychology, mental processing is habitually referred to as human information processing.
In any case, my own use of the terms “information” and “information
structure” relates to a fairly precise use of these terms that is applicable in
psychophysics, for example in the study of difference limens (minimal
discriminable differences). Such
studies document whether or not physically measurable differences in stimuli
are translated (by sensory/perceptual processes) into consciously
perceived differences, that is whether or not information about physical
differences is translated into detectable changes in phenomenology.[12] In the same way it is possible to study whether
or not physical/functional differences in neural representations of stimuli are
translated into detectable changes in phenomenology. Physical/functional changes in neural representational states
that are translated may be said to be
of the same “grain” as the conscious phenomenology and to mirror its
information structure. Rakover
doubts that it would be possible to identify
such information bearing states, as one cannot remove one's dependence on
subjective reports of what is or is not experienced. I agree that one cannot remove subjective reports. However, the combination of subjective
reports with triangulating third-person observations of neural states is
standard practice in neuropsychology.
Investigation of the neural correlates of consciousness is technically
difficult, but the field of investigation is already very large (cf Metzinger,
2000). Rather than being questionable
science, it is unquestionably normal science.
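For reference, the “precise Shannon sense” of a bit mentioned above is easily stated: a selection among N equally likely alternatives carries log₂N bits, so that, for example, identifying one of 8 equally likely stimuli transmits log₂8 = 3 bits. The point made above is that human mental processing rarely fits this idealisation exactly, which is why my appeal to “information structure” is tied to discriminable differences in phenomenology rather than to bit counts.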
Analogies
The ways in which different conscious experiences relate to their physical correlates have to be understood in their own terms. Some properties of these relationships appear to resemble ones that are already well understood in natural science, but, as far as one can tell, no other purely physical system provides an exact homology. Crucially, the relations of experiences to their physical correlates have to be understood in terms of how certain phenomena (the experiences) viewed from a first-person perspective relate to other phenomena (the correlated brain states) viewed from a third-person perspective. By contrast, all the properties of physical systems (conventionally understood) can be viewed from a third-person perspective.
Videotapes and TV screens. Sometimes,
however, analogies can help. For example, to understand how experiences and
their physical correlates might encode identical information without themselves
being identical it is useful
to know that such a dissociation between representations and representing
systems is commonplace in technology—as in my example of the play “Hamlet”
encoded on videotape or displayed on a screen. Given my limited intent, it is
hard to understand Chrisley &
Sloman's claim that this analogy “misfires.” As they correctly note, “this is not an ontological reduction”. However, according to them, “it is an epistemological one”, and
then they go on to claim that, “epistemological dualism is the only thing
separating Velmans from the physicalist positions he rejects.” But how can this
be?
Videotapes and TV screens encode
information in entirely different formats. Even when they encode information
about the same thing, they do so in two entirely different ways—which is
broadly analogous to knowing about one thing in two different ways. So in what sense is this an “epistemological
reduction”? Admittedly, there is one known, the nature of mind, with two
(material and phenomenal) aspects, by which it can be known. But, given that I
suggest the nature of mind to be “psychophysical,” in what sense is this
“physicalist”?[13]
Electricity
and magnetism. The
same information can be formatted differently, depending on the characteristics
of the representing system. If one can specify the different ways that given
information is formatted, then it should be possible, in principle, to specify
how those different formats map onto each other. In TA and UC I suggest that,
in some future neuropsychology it might be possible to specify how the
phenomenology of given conscious experiences maps onto their physical
correlates in this way. This might
provide a dual-aspect account of the nature of mind in which the relationships
between its physical and phenomenal aspects were specified precisely, perhaps
with the precision that electrical current in a wire can be related to its
surrounding magnetic field.[14]
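To make the comparison concrete, the precision alluded to here is, for example, that of Ampère's law for a long straight wire, which relates the steady current I to the magnitude B of the surrounding magnetic field at distance r (the symbols here are the standard physical ones, not the B and C used earlier):

  B = μ₀I/(2πr).

The hope expressed above is that, in some future neuropsychology, neurophenomenological bridging laws might relate the physical and phenomenal aspects of mind with comparable exactness.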
Chrisley & Sloman confusingly suggest that the duality that I have in mind with
the electromagnetism analogy, “is one of aspects, not of ontological
character.” What I actually suggest is that the phenomenal and physical aspects
of mind specify its (psychophysical)
ontological character.[15]
Even more confusingly they go on to write, “.. the analogy doesn’t work:
electricity and magnetism are not simply two ways of thinking about the same
phenomenon, but two different physical phenomena that can be related to each
other mathematically.” Given that I
never suggest that electricity and magnetism are simply two ways of thinking
about electromagnetism (rather than genuine aspects), nor that physical and
phenomenal aspects of mind are simply two ways of thinking about mind (rather
than genuine aspects) the relevance of this comment to my analysis is hard to
understand. They then add to the confusion by going on to write,
“In
contrast, and crucially, Velmans claims that the difference between first and
third person ways of thinking of psychophysical stuff is merely that of
differently formatted ways of representing the same information.” I claim nothing of the sort. As noted
above, first- and third-person (phenomenal and physical) aspects of mind are
not merely different “ways of thinking” about it. Being genuine phenomenal and physical aspects (or manifestations) of mind, they can in principle encode
the same information in different phenomenal and physical formats. Chrisley & Sloman go on to note
that “the electrical phenomenon is not just an aspect, a way of formatting the
same information as that represented by the magnetic way of looking at the
situation. There are situations where only the electrical description applies,
and other situations where only the magnetic description applies.” I
agree—although this again has nothing to do with my analysis of dual-aspect
monism or my use of the electricity/magnetism analogy.[16]
They go on to conclude that, “Prima facie, this suggests that there are two distinct
phenomena involved; to argue that there is actually only one, root phenomenon
will require further work from Velmans.”
Here I disagree. Electricity and
magnetism are indeed distinct phenomena, but the view that they are both
manifestations of only one root phenomenon (electromagnetism) is received
wisdom in physics. It requires no further work from me.[17]
Wave-particle complementarity.
In TA note 13 and UC I note that my dual-aspect analysis of mind also has some
interesting resemblances to wave-particle complementarity in quantum mechanics
– although, once again, the analogy is far from exact. Quanta either appear to
behave as electromagnetic waves or as particles depending on the observation
arrangements. And it does not make sense to claim that electromagnetic waves
really are particles (or vice versa).
A complete understanding of quanta requires both complementary descriptions. First-
and third-person observations of mind also depend on very different
observational arrangements, so that may help to explain why, from a
first-person perspective it takes the form of conscious phenomenology, whereas
viewed from the outside it appears to be a brain. Like wave-particle accounts
in quantum mechanics, phenomenal and physical accounts of the mind's operations
appear to be complementary and mutually irreducible. A complete account of mind
requires both.
Note
that these distinguishing features of dual-aspect monism contrast sharply with
competing analyses of the experience/brain state relationship. Substance
dualists maintain that experiences and correlated brain states are entirely
different “substances” or “entities”, idealists argue that all physical
entities (including brain states) are really forms of mind or consciousness (Rao),
and physicalists argue that experiences are nothing more than states of the
brain (Torrance, Van Gulick, Chrisley & Sloman). All these positions have well known problems. For example,
dualism splits the universe in a way that makes it difficult to get it together
again, idealism does not cope well with the apparent, autonomous existence of
the material world, and physicalism does not cope well with the phenomenology
of conscious experience. I have argued that dual-aspect monism allows one to
accommodate first- and third-person evidence in a more natural way that avoids
such problems. While the case for this above (and in Velmans, 1991, UC and TA)
does not rely in any way on analogies from other branches of science, the
parallel with wave-particle complementarity in quantum mechanics is suggestive.
However,
according to Chrisley & Sloman,
this analogy “is even worse”—although they take issue not with me, but with the
founders of quantum mechanics. They write, “More and more physicists and
philosophers take the appeal to complementarity as a reductio ad absurdum of particular ontological positions in quantum
mechanics. They do not deny the veracity of the data that have led some to
conclude that quanta have both wave and particle aspects; but they do deny that
the paradox of complementarity is a satisfying way of accounting for that data.
There are other, less paradoxical and thus more satisfying metaphysical
pictures on offer (e.g. Bohm, 1952; Hiley and Pylkkänen, 2001). To say that
your metaphysics of mind is akin to the wave/particle complementarity
metaphysics of quanta is just another way of saying that you don’t have a
satisfying metaphysics, and choose instead to ‘live with’ the paradoxes.” I
think that is being rather unfair to our colleagues in physics. The majority of
physicists are more concerned with whether the mathematics of QM accounts for
the data, and they think of (exclusive) complementarity as a current, best
description of the empirical findings, imposed by the limitations of
measurement, rather than “a reductio ad
absurdum of particular ontological positions.” Nor is there any emerging
consensus about what would be a satisfying metaphysics. As it happens, I share Chrisley & Sloman’s interest in more classical accounts of QM
findings (in spite of this being a minority view in physics). According to Bohm
and his collaborators, wave-like and particle-like behaviour are manifestations
of a unified, grounding reality (Bohm often refers to this as an “implicate
order”) just as I have claimed experiences and their physical correlates to be
dual-aspects of a unified, psychophysical mind. So adopting a classical
metaphysics in QM (in the way that Chrisley
& Sloman suggest) would make the analogy with dual-aspect monism even
closer!
In sum, let me
stress again that analogies have their purposes, but they are not
homologies. The analogies that I have
used illustrate how phenomenal and physical representational systems might
format the same information in different ways, and how phenomenal and physical
aspects of mind might be tightly bound to each other without reducing to each
other. But I do not claim consciousness to be literally a picture on a TV
screen, a magnetic field, or a wave-like QM phenomenon (to claim all three
simultaneously would in any case be absurd). The relation of any given
conscious experience to its physical correlates has to be understood in its own
terms.
A
re-examination of what we take for granted.
What has ontological
primacy--consciousness or the physical world? In current Western philosophy and science
the existence of the physical world is generally taken for granted, while the
existence of consciousness is thought to be somewhat mysterious. The physical world is also generally assumed
to be the primary reality on which other “emergent” forms of existence such as
mind and consciousness depend. Chrisley
& Sloman, for example, take it for granted that the
physical/experiential relationship is asymmetrical. Physical states can exist without accompanying experiences (e.g.
in the form of preconscious brain states)—but conscious experiences cannot
exist without accompanying physical states.
As they note, “The only way to
impose symmetry would be to assume (as others have been forced to do, e.g.
Chalmers, 1996) that whenever there is a physical phenomenon, there is some
experiential phenomenon, however slight or imperceptible or implausible,
accompanying it. Panpsychism threatens.”
It is instructive to note, however, that
such opinions about what has ontological primacy and what constitutes a ‘threat’
(to right thinking) are not universal. As Rao
points out, very different views about the ontological status and
distribution of consciousness and mind dominate in philosophical traditions
that have developed in the East. In these traditions, the irreducibility of
consciousness to brain states is taken for granted and consciousness, not the
physical world, is thought to be primary.
In some Indian traditions for example, the physical world is thought to
be a projection of consciousness constructed by the mind.
How is it possible that thinkers in the
West and the East have come to such very different conclusions? Note that the ontological primacy of either
consciousness or the physical world is not obvious from the immediate,
empirical “evidence of our senses” for the simple reason that, in everyday
life, conscious experience and what we normally think of as the “physical
world” co-arise. That is to say, what we normally think of as
the physical world just is the 3D
phenomenal world that we experience.[18]
However, Western and Eastern thinkers have traditionally taken a very different
interest in what is experienced.
Western “third-person” science has traditionally been interested in experience
as a means to an end, namely the nature, control and transformation of the
entities and events that such experiences represent
(what they are experiences of) and
has developed investigative methods and technologies appropriate to these
interests. By contrast, Eastern “first-person” philosophy and science has
traditionally been interested in the nature, control and transformation of the
experiences themselves, and has
developed methods appropriate to these aims. I suggest that these different
foci of interest and accompanying methodologies partly explain East-West
differences of opinion about what has ontological primacy.
It is not altogether surprising that if
one’s third-person investigative attention is entirely focused on the material
entities and events that one’s experiences represent, one might conclude their
fundamental nature to be entirely material. Many external entities and events
appear to exist whether they are experienced or not, thereby supporting their
ontological primacy and a form of physical realism. In the human brain some
processes appear to be accompanied by consciousness while others appear to be
preconscious, unconscious or nonconscious, suggesting a physical/experience
asymmetry. Viewed from the outside, the material forms of entities and events
are evident, but not any accompanying experience, even in other human beings
(the problem of “other minds”).
Consequently panpsychism looks dubious.
On the other hand, if one’s first-person
investigative attention is focused in ever finer ways on conscious experience
itself it is not surprising that one might conclude its fundamental nature to
be a refined form of consciousness (traditionally a “pure”, contentless
consciousness). Conscious experience is in any case “immediately given” and is
epistemically primary in the sense that it provides the foundation for the
acquisition of all empirical knowledge. Indeed, what we normally think of as
the “physical world” just is the 3D phenomenal
world that forms part of everyday conscious experience (see above). Conversely,
without conscious experience this phenomenal
physical world would not exist (a form of idealism), thereby providing grounds
for the Eastern view that consciousness has ontological primacy.
Which view is correct? It is not possible to attempt a full
analysis in a few lines. However, in UC and TA I develop a dual-aspect,
reflexive monism that treads a careful path between taking either a first- or
third-person approach to be more privileged or fundamental. Rather, these perspectives are complementary
and mutually irreducible. For example, in Velmans (1990a) and UC chapter 7 I
suggest that Eastern idealism and Western realism may both be true although
they are true about different things. Idealism may be said to apply to the
observer-dependent existence of the phenomenal
world while realism applies to the observer-independent existence of the
entities and events (things themselves) that experienced phenomena represent. Under normal conditions, neither a first- nor a third-person
perspective provides a “view from nowhere,” that is, a view of the thing-itself
as it is in-itself, even if the aspect of the thing-itself under scrutiny is
the human mind. Conversely, both
investigative routes can lead to deeper knowledge. Third-person science provides a deeper knowledge of the material
world, understood in a third-person way.
First-person investigations of consciousness provide a deeper knowledge
of one’s own mind, understood in a first-person way. My route to this position
is an entirely conventional Western one, relying on the normal triangulation of
scientific evidence, everyday experience, common sense and theory. Nevertheless, once the implications of this
position are fully worked out (in terms of what consciousness is and does, and
how it relates to the brain and the physical world) the Reflexive Monism that results takes one a long way from current
Western materialism. I conclude for
example that
“Human minds, bodies and brains are embedded in a
far greater universe. Individual
conscious representations are perspectival.
That is, the precise manner in which entities, events and processes are
translated into experiences depends on the location in space and time of a
given observer, and the exact mix of perceptual, cognitive, affective, social,
cultural and historical influences which enter into the “construction” of a
given experience. In this sense, each conscious construction is private,
subjective, and unique. Taken together, the contents of consciousness provide a
view of the wider universe, giving it
the appearance of a 3D phenomenal world.
This results from a reflexive interaction of entities, events and
processes with our perceptual and cognitive systems that, in turn, represent those entities, events and
processes. However, conscious representations are not the thing-itself. In this
vision, there is one universe (the thing-itself), with relatively
differentiated parts in the form of conscious beings like ourselves, each with
a unique, conscious view of the larger universe of which it is a part. In so
far as we are parts of the universe that, in turn, experience the larger
universe, we participate in a reflexive process whereby the universe
experiences itself.” (UC, p233).
Later, I add, “In this sense, we participate in a process whereby the universe observes itself – and the universe becomes both the subject and object of experience. Consciousness and matter are intertwined in mind. Through the evolution of matter, consciousness is given form. And through consciousness, the material universe is real-ized.” (UC, p280).
It is not possible to summarize the full
implications of reflexive monism in a few lines, let alone the case supporting
it. However, as Rao notes, my route appears to travel from West to East. His only complaint is that I have not travelled far enough! While I do not have space to
deal with how UC relates to various Eastern philosophies in any detail, Rao’s comments provide a welcome
opportunity to assess the internal coherence of TA (and UC) from a very
different perspective, and it is instructive to address his main points.[19]
From West to East?
Any comparison of Eastern and Western
views of “consciousness” and “mind” has to start with a clarification of terms,
for the simple reason that in the West and the East the terms “consciousness”
and “mind” are habitually used in different ways. As Rao notes, I
largely confine my use of the term “consciousness” to phenomenal consciousness—the
everyday experience of the external world, the body, and inner
experiences (such as thoughts, feelings and so on). Although there are many uses of the term “consciousness” in the
West, phenomenal consciousness is arguably closest to its most common
usage. Crucially, it is consciousness
in the sense of “phenomenal consciousness” that poses “hard” problems of the
kind currently discussed in Western philosophy such as “How could conscious
experiences affect the activity of brains?” (the subject of TA).
I also largely follow current Western
conventions in my use of the term “mind.”
As with “consciousness”, the term “mind” has various uses. However, in psychology
it is typical to think of the human mind as that which enables us to function in certain ways (to think, to
solve problems and so on). Although the details of how consciousness, mind and
brain relate are in dispute, there is consensus that “mind” is intimately
connected to both brain and consciousness. A major finding of 20th
century psychology is that mental processes may or may not “be conscious.” Some processes have associated phenomenal
contents, while others are preconscious, unconscious, or nonconscious.
Consequently, in Western psychology, “mind” is commonly thought of as encompassing consciousness.
Eastern common usage of the terms
“consciousness” and “mind” is somewhat different. However, it is not difficult to tease out terminological differences
from genuine, theoretical ones. At
first glance, the Samkhya-Yoga tradition described by Rao might look very different to Reflexive Monism. In this tradition, consciousness, with purusha [20] at its center, forms the ground of
one's individual being. It is the
contentless container within which perspectival, phenomenal consciousness takes
form. Mind, unlike consciousness, is physical in that
it can be described in material forms and accounted for in physical terms. “The
purusha as the center of consciousness
is distinct and has unique experiences through its associated mind-body
complex. Such observer dependent
relativity, in Yoga as well as in Vedanta, is not absolutely given but a
transient condition that can be overcome by disciplined practice. The purusha
finds itself reflected in the mind illuminating the material forms of the
universe. Thus mind becomes an
instrument through which the universe reveals itself. Subject-object distinction is not fundamental. It is a contingent manifestation of the
mental process by which the universe is revealed.” (Rao, p x)
In my own analysis in UC I am careful to remain
within the evidence base currently accepted by Western science, and I tease the
modern problems of consciousness away from more traditional concerns with the
nature of the “soul” (UC, pp15-16).
Consequently, I do not comment on the existence or operations of
“purusha”. While I have no doubt that first-person investigative attention can
lead to a deeper understanding of mind (see above) I also remain neutral about
whether disciplined practice can entirely remove one’s observer
dependent relativity, or whether the ensuing conscious state can be entirely
contentless (UC, chapter 1, note 2).[21]
Nevertheless, there are broad similarities between Reflexive Monism and the
Eastern view that Rao describes. Like Samkhya-Yoga philosophy (and Western materialism), I accept that mind has (third-person) physical aspects that
provide an instrument for the formation of phenomenal consciousness. I also
accept that both phenomenal consciousness and material aspects of mind are
grounded in something deeper, namely a self-revealing universe in which the
subject-object distinction is not fundamental (see above). However, in my own
analysis, the terms I use to refer to what is deeper are different. For example, “consciousness with purusha at its center” is replaced by the “deeper
nature of mind” (or, in Kantian fashion, “mind-itself”).[22]
These different
uses of terms partly account for a number of confusions in Rao’s summary of my own theoretical position. Rao
notes that both Indian theories and my own make a distinction between
consciousness and mind. But he suggests
that, “In the
Indian view, the distinction is fundamental and primary in the sense that one
is not reducible to the other. In
Velmans’ view, the distinction is secondary and holds good at the
epistemological level and not at the ontological level. Thus consciousness becomes a subcategory or
species of the mind.”
In fact, however, I never suggest that
“consciousness” interpreted in the broad
Eastern sense is an aspect of the material mind interpreted in the narrow Eastern sense (that would indeed be
inconsistent with my view that consciousness cannot be reduced to states of the
brain). What I actually suggest is that
phenomenal consciousness (understood in the
conventional Western sense) is an aspect of the deeper nature of mind (mind-itself). The neural correlates of consciousness and
other forms of brain functioning provide the complementary, material aspect of
mind-itself. Being genuine aspects,
both consciousness and brain have an ontology,
as well as providing first- and third-person means by which the mind can be
known. Consequently Rao is wrong to suggest that the
distinction between mind and consciousness in my own work is purely
epistemological. And he mistakes my suggestion that mind-itself encompasses
consciousness to mean that the material
aspect of mind encompasses consciousness.
Rather, the deeper, psychophysical nature of mind encompasses both its
manifest conscious and material aspects.[23]
If
one replaces Rao’s Eastern use of the
term “consciousness” with my use of the term “mind-itself” or, more broadly, “the thing-itself”, one immediately clears up a number of other confusions. Rao writes, for example, “the distinction
between first-person consciousness and third-person consciousness adds little
to the clarity of the concept of consciousness. Consciousness is consciousness
whether we look at it from a first-person or the third-person perspective. It
may manifest different characteristics at different levels of observation, but
it underlies all awareness.
Consciousness is what makes awareness possible. It is the ground condition for all forms of
awareness, like matter which is the ground condition for all the material forms
we experience.” (p x)
Viewed
in conventional Western terms, Rao’s statement
makes no sense, for the reason that phenomenal consciousness cannot be viewed
from a third-person perspective (whatever the level of observation). In the West, the terms consciousness,
phenomenal consciousness and conscious awareness are often used interchangeably
(I do so in my own work). Consequently
it makes no sense to suggest that consciousness underlies awareness (it cannot
underlie itself). The suggestion that
“consciousness is the ground condition for all forms of awareness, like matter
which is the ground condition for all the material forms we experience” is also
inconsistent with the view that consciousness has ontological primacy over
matter (the alternative is ontological dualism). By contrast, “mind-itself” can be viewed from first- and
third-person perspectives, does underlie phenomenal consciousness and is
the ground condition for both its conscious and material manifestations
(thereby avoiding dualism).
Of course, these different terms for what
has ontological primacy in the East and in the West (and their corresponding
descriptions) also reflect substantive theoretical differences. In Samkhya-Yoga philosophy, “consciousness with
purusha at its center” is the fundamental
reality. In Western materialism the
physical world is the fundamental reality. In Rao’s opinion I have to choose between these: if I reject the
reducibility of conscious experience to brain states, I must accept the primacy
of consciousness. Not so. I accept that
if one investigates the mind from a third-person Western perspective it will
appear to be entirely physical while if one investigates it from a first-person
Eastern perspective it will appear to be entirely conscious experience (see
above). But, as far as I can judge, neither route to knowledge of the mind is
privileged, incorrigible or complete. Rather, first- and third-person routes to
knowledge of the mind are complementary and mutually irreducible. Consequently,
the “deeper nature of mind” (mind-itself) is better described as
psychophysical.
Rao suggests that, in my own analysis, there is an asymmetry between
conscious states which do not reduce
to states of the brain and nonconscious mental activities which do reduce to brain states. In his view this leads to “the real
problem.” He writes, “Velmans
acknowledges that consciousness is not reducible to brain states or
functions. Yet, he considers
consciousness an aspect of the mind.
The mind in his view is broader to include nonconscious mental
activities as well. Here rests the real
problem. Consciousness (subjective
experience) is irreducible to neural states or brain functions, whereas the
nonsubjective states of the mind are in principle reducible. In the light of such a fundamental
difference between them, it is hardly plausible to argue that consciousness is
a species or an aspect of the mind. The
irreducibility of consciousness to physical states entails that the difference
between conscious and nonconscious aspects of the mind is one of kind, primary
and fundamental. Reducibility or
otherwise of one category into another is an ontological matter and not simply
an epistemological issue.” (p x)
Epistemological symmetries and asymmetries
between first- and third-person perspectives are important, but I agree with Rao that reducibility is an ontological
matter, not an epistemological one. However, such an ontological asymmetry
between conscious states and nonconscious ones would occur only if the
nonconscious nature of mind turned out to be entirely physical (as
Rao himself assumes). If so, conscious mind would have dual-aspects, but nonconscious mind
would only have a physical aspect. As
it happens, a similar view to this is held by those “nonreductive physicalists”
who adopt property dualism. Whether this leads to a “real problem” depends on
whether such asymmetries actually occur in nature or not (if they do, it would
be perverse to regard them as a problem). Chrisley
& Sloman for example take such asymmetries for granted. As they note, “There is a fundamental asymmetry between the physical and the
conscious: Physical laws apply everywhere, both in situations where there is
and where there is not consciousness, while the converse does not hold. So
there seems to be a primacy of the physical, and one must reply to the idea
that it is this physical, causal reality which is always doing all the work.”
(p x) Rao adopts the opposite view that a pure consciousness without any
material form is the basis of everything, but does not appear to recognize that
this produces an inverse asymmetry (in which physical matter becomes secondary
to consciousness).
Whether such asymmetries actually occur in Nature is up to Nature, and whether they do or not is largely tangential to the analysis of consciousness that I have given in TA and UC. It is important to note, however, that dual-aspect theory, unlike both materialism and idealism, avoids ontological asymmetries: it allows for the possibility that mind-itself has a dual-aspect, psychophysical nature irrespective of whether its operations are unconscious, preconscious or conscious.[24]
On this interpretation, the dual-aspect
nature of mind is fully manifest only in those aspects of mind that are
“conscious”. However, with appropriate
investigative techniques, some preconscious and unconscious aspects of the mind
can become conscious (in the sense
that we can become aware of those aspects or real-ize their nature).[25]
Unconscious and preconscious aspects of mind can also be thought of as
psychophysical in the sense that they can have causal effects on both conscious experiences and physical states of
the body/brain, for example in the operation of preconscious free will (see
TA).[26]
Note that whatever view one adopts about what is primary, one is left with
the problem of origins. In the West, we
generally accept that the origins and existence of consciousness are somewhat
mysterious (when and why did it emerge?).
But we habitually ignore the fact that the origins and existence of
matter are equally mysterious. Why
should there be anything rather than nothing? As the origins of a
“psychophysical mind” are also mysterious, the choice between these three positions
has to be made on other grounds.
Which view is preferable? Note that there
are “hard” problems associated with taking either the material world or
conscious experience to be more primary than the other. In the West it is well
recognized that taking the material world to be primary leaves one with the
problem of consciousness. How could something like an experience emerge from a
material world that does not already have it?
It is perhaps less well recognized in the East that if one takes the
existence of consciousness to be primary one is left with the inverse problem.
How could something like an independently existing material world emerge from
something like an experience? If the thing-itself and mind-itself are fundamentally psychophysical, one avoids
such problems.[27] And one can
then make sense of mind/body interactions observed in clinical practice and
everyday life.
References
Arbuthnott, K.D. (1995), ‘Inhibitory mechanisms in cognition: Phenomena and models’, Cahiers de Psychologie Cognitive, 14(1), pp. 3-45.
Baars, B.J., Banks, W.P. and Newman, J.B. (2002) (eds.), Essential Sources in the Scientific Study of Consciousness, The MIT Press (in press).
Chalmers, A.F. (1992), What is this Thing Called Science?, Open University Press.
John, E.R. (2002), ‘The neuropsychology of consciousness’, Brain Research Reviews, 39, pp. 1-28.
Kim, J. (1999), Mind in a Physical World, MIT Press.
Libet, B. (2003), ‘Can conscious experience affect brain activity?’, JCS (in press).
McFadden, J. (2002a), ‘Synchronous firing and its influence on the brain’s electromagnetic field: Evidence for an electromagnetic theory of consciousness’, JCS, 9(4), pp. 23-50.
McFadden, J. (2002b), ‘The conscious electromagnetic information (Cemi) field theory: The hard problem made easy?’, JCS, 9(8), pp. 45-60.
Melzack, R. (1987), ‘The short-form McGill Pain Questionnaire’, Pain, 30, pp. 191-197.
Metzinger, T. (2000) (ed.), Neural Correlates of Consciousness, The MIT Press.
Miller, G.A. (1956), ‘The magical number seven, plus or minus two: Some limits of our capacity for processing information’, Psychological Review, 63, pp. 81-97.
Perruchet & Vinter (2003), ‘The self-organizing consciousness’, Behavioral and Brain Sciences (in press).
Pockett, S. (2002), ‘Difficulties with the electromagnetic field theory of consciousness’, JCS, 9(4), pp. 51-56.
Searle, J. (1990), ‘Consciousness, explanatory inversion and cognitive science’, Behavioral and Brain Sciences, 13(4), pp. 585-642.
Thompson, E. (2001) (ed.), Between Ourselves: Second-Person Issues in the Study of Consciousness, Imprint Academic.
Varela, F. and Shear, J. (1999) (eds.), The View from Within: First-person approaches to the study of consciousness, Imprint Academic.
Velmans, M. (1990), ‘Is the mind conscious, functional, or both?’, Behavioral and Brain Sciences, 13, pp. 629-630.
Velmans, M. (1990a), ‘Consciousness, brain, and the physical world’, Philosophical Psychology, 3, pp. 77-99.
Velmans, M. (1991a), ‘Is human information processing conscious?’, Behavioral and Brain Sciences, 14(4), pp. 651-701.
Velmans, M. (1995), ‘The relation of consciousness to the material world’, Journal of Consciousness Studies, 2(3), pp. 200-219.
Velmans, M. (1999), ‘Intersubjective science’, Journal of Consciousness Studies, 6(2/3), pp. 299-306.
Velmans, M. (2001), ‘Heterophenomenology versus critical phenomenology: A dialogue with Dan Dennett’, http://cogprints.ecs.soton.ac.uk/archive/00001795/index.html
[1] As Torrance points out (in his note 2), some philosophers have tried to defend identity theory by arguing that Leibniz's law does not apply to 'referentially opaque contexts'. I might have a twinge in my knee and just not know it to be identical to neural-bodily state S, so I might conclude that they are not identical even though they are. Torrance is cautious about this argument and I share his caution. In some future state of neuroscience we can envision having a given experience C, knowing all about its physical correlates B, and still not being convinced of their identity (given Leibniz's law).
[2] Note that this block to establishing the ontological identity of conscious states with correlated physical states applies irrespective of the level of organisation of the physical states. That is, the block applies just as much to so-called “nonreductive physicalism” as it does to old-style reductive physicalism. Given this, it is not obvious how Van Gulick's suggestion that “higher level regularities might apply ‘in virtue’ of lower level ones” would actually resolve the problem of causal closure.
[3] Whether more primitive forms of material organisation are associated with more primitive forms of conscious experience is a separate (controversial) issue that we need not address here. Panpsychists such as Penrose and Hameroff for example suggest that even microphysical events are associated with primitive experiences.
[4] In Velmans (1990) for example, I defend the conventional cognitive view that many mental states are unconscious and take issue with Searle's (1990) “connection principle” which explicitly links the criterion for being mental to being potentially conscious.
[5] This
also addresses Chrisley
& Sloman's point that, “We’d like to think that our conscious states
have causal power by virtue of their being the mental states that they are, not
by virtue of being identical with some physical state, which itself has, by
virtue of falling under physical laws, the true causal power.” Indeed! And that
is yet another reason for rejecting any version of physicalism. In the above analysis, conscious experiences
are not identical to (correlated)
physical states. Nor do they “supervene” on physical states (with the
implication that the latter are ontologically more basic). They are first-person
manifestations of the operations of our own psychophysical minds. They have causal powers in the sense that
any phenomena can have causal powers. Although they only represent the
operations of mind-itself (ur mind), from a first-person perspective we can
take them to be the operations of
mind.
[6] The
need to have both first- and
third-person accounts for a complete account of mind makes it clear why such
accounts do not face the problem of “overdetermination” (see Chrisley & Sloman).
[7] Note that falsifiability is one useful criterion of a good scientific theory, but it is not an infallible criterion or the
only criterion (other tests include verifiability, explanatory elegance,
simplicity, sufficiency, productivity and so on). See Chalmers (1992) for a
useful introduction.
[8] Chrisley & Sloman suggest (in their note 4) that I should say that “the physical aspect must contain at least as much information structure as the experiential aspect” rather than claiming them to have “identical information” as the physical aspect will typically encode more information than the experiential aspect. I do not deny that the brain encodes far more information than that which is manifest in conscious experience, or that this information may support the formation and functioning of the correlates of experience. However, information encoded in the brain that is not encoded in experience is not, in the strict sense that I intend, a “correlate” of that experience.
[9] As Chrisley & Sloman point out, it is important to distinguish the functions that are implemented by a system from the methods it uses to implement those functions. They present this as an issue on which we disagree, suggesting that a strong phenomenal experience/neural correlate “mirroring” is implicit in my analysis. But, as should have been clear from TA note 10, this is actually an issue on which we agree.
[10] Standard measuring instruments include verbal rating scales, numerical rating scales, visual analogue scales and questionnaires such as the McGill Pain Questionnaire (Melzack, 1987).
[11] Rakover also complains that phenomenal measurements cannot meet the requirements of objectivity, publicity, and repeatability. I disagree. However, this is a large topic on which I have written extensively, both in this journal (Velmans, 1999) and in UC chapter 8. Given the limitations on space in this reply, I ask interested readers to refer to these prior sources.
[12] If the physical differences can be consciously perceived we can say that information about physically measurable differences has been successfully “transmitted” or transformed into discriminable, phenomenal differences. Note that it is often possible for physical differences in stimuli to be detected in spite of not being consciously perceived (for example if subjects are required to guess). As this is tangential to the point at issue I will not elaborate on it here.
[13] Chrisley & Sloman go on to claim, “It is this need to distance himself from physicalism which raises the second problem with the analogy: he admits that the videotape and the screen are ontologically distinct, yet he was supposedly defending an ‘ontologically monist’ position! It seems Velmans ends up with the converse of the position for which he was aiming: ontological dualism, but epistemological monism (in the sense that strong assumptions are made about ‘informational mirroring’).” This confused analysis of the intent of my videotape/TV screen analogy needs some unravelling. It is true that conscious experiences and their neural correlates have distinct (phenomenal and physical) characteristics and in that sense may be said to have distinct ontologies. But this does not prevent them being aspects of an underlying, unified mind, thereby making my dual-aspect theory ontologically monist (in the tradition of Spinoza). Nor does the possession of distinct phenomenal and physical characteristics prevent experiences and their correlates from encoding identical information. The videotape/TV screen analogy provides one example of how representational systems can encode identical information without having identical characteristics. It should have been obvious that I did not mean to suggest that brain states are literally a form of videotape and experiences literally a kind of TV screen, or that experiences can somehow be decoupled from their physical correlates! Nor does it make sense to interpret the view that one can know (or represent) one thing in two different ways as “epistemological monism.”
[14] As it happens, a psychophysical theory relating information encoded in the brain’s electrochemistry to a pooled, integrated form of the same information encoded in the brain’s electromagnetic field has recently been proposed in this journal by McFadden (2002a,b). According to McFadden this EM field is the physical substrate of phenomenal consciousness (see also Pockett, 2002; John, 2002). While I am not committed to the details of this theory, and do not think it solves the “hard” problem (the EM field would still have to have dual-aspects to bridge the gap from physics to phenomenal experience), it does illustrate the type of theory that I have in mind.
[15] The fact that one has different (first- and third-person) forms of access to these (phenomenal and physical) aspects of mind does not alter the point that these aspects specify the mind's ontology.
[16] It is hard to know what Chrisley & Sloman mean by a “magnetic way of looking at the situation.” Unlike them, I do not confound the dual-aspect ontology of mind, or the way information is formatted within its phenomenal and physical aspects, with first- versus third-person ways of examining the mind's phenomenal and physical aspects. Likewise, I do not confound the electrical and magnetic manifestations of electromagnetism, or the possibility of encoding information in either electrical or magnetic formats, with the different ways in which we can investigate electricity and magnetism.
[17] As I point out in TA note 14,
I am only concerned here with the broader implications of dual-aspect monism.
Consequently, it seems to me useful to suggest that there might be a
psychophysical unity underlying the phenomenal and physical aspects of mind,
that is broadly analogous to the electromagnetic unity underlying electricity
and magnetism. It goes without saying that I am not suggesting that conscious
phenomenology is magnetism, or that its
physical correlates are electricity.
The precise way that given conscious experiences map onto their
physical correlates can only be discovered by neuropsychological research and,
in this sense, “requires further work.”
[18] The appearance of the 3D phenomenal physical world is not of course identical to the more abstract world described by physics (quantum mechanics, relativity, string theory and so on). The relation of the phenomenal physical world to the world described by physics is central to a proper understanding of the consciousness/material world relationship, and I discuss this in depth in UC chapters 6 and 7. As this relationship is somewhat tangential to the issues raised in TA and the commentaries, I will not elaborate on it here.
[19] Rao also raises a number of questions in passing, for example, what is meant by “information structure”, what encodes that information, how does “perspectival switching” work, and in what sense is information viewed from a first- and a third-person perspective complementary. As I have dealt with these issues earlier in this reply I won’t return to them here.
[20] In the Samkhya-Yoga tradition, Purusha refers to one's true, individual, immaterial essence (also referred to as Atman or soul).
[21] Rao writes
that “Velmans speaks of direct and indirect knowledge, as the Indian theories
do. … In Yoga theory, even the so-called first-person experience is indirect,
because what the mind presents to consciousness are representations mediated by
the perceptual and cognitive systems. Consequently, awareness arising from such
mediation is also indirect. In other words, in Velmans, the direct acquaintance
is with the representations, whereas in Yoga it is with the things
themselves. Such direct knowledge
results when the mind detaches itself from the sensory inputs and makes contact
directly with the objects, events and processes in the universe. This is what
may be labelled as paranormal process distinguished from the normal process in
which there is the involvement of the sensory processes.” (p x) However, this doesn’t quite capture the
similarities and differences between Reflexive Monism and the Yoga theory that Rao describes. In UC chapter 7 I
develop the view that, under normal conditions, we have direct acquaintance
with our own experiences but only indirect acquaintance with the
things-themselves that such experiences represent. Given that normal
experiences are representations, I agree with the Yoga view that they only
provide indirect knowledge of things themselves, even when the things
themselves that we experience are the operations of our own minds. Nevertheless, contra Kant, I argue that the
thing-itself (including mind-itself) is knowable through the representations
that we have of it, and the aim of both first- and third-person science is to
achieve deeper, more complete knowledge.
Knowledge can be gained through direct acquaintance, by experiencing
that which one seeks to know, or indirectly, through the use of symbols
(description, theory and so on). But it
is only through direct experience that things become subjectively real
for us (we real-ize what they are like). One can only really know love
for example by real-izing what it is like to be in love. This, I think, gets quite close to the Yoga
view, with the caution that I remain noncommittal about the possibility (often suggested in Yoga philosophy) that it is possible for embodied human beings to fully know (real-ize) the thing-itself as it is in itself, that is, to have “direct”
knowledge in the sense of knowledge that is “perfect and complete”. I also do not comment in UC on the nature
and existence of paranormal phenomena.
[22] In this usage, mind-itself is that aspect of the thing-itself (the ground of being) that forms the basis of the manifest aspects of one’s own mind (i.e. its third-person, material and first-person, phenomenal consciousness aspects).
[23] Other confusions in Rao’s commentary can be traced to
differences in use of terms combined with differences that arise from taking a
first-person route to the nature of mind to be more primary than a third-person
route. Rao for example claims that equating consciousness with
phenomenal consciousness entails confusion “between the contents and the
container, between substance and form.” I accept that, viewed from an Eastern
first-person perspective, a form of pure contentless consciousness might appear
to underlie everyday phenomenal consciousness, and the former is therefore
viewed as the container of the latter.
However, my own, somewhat different dual-aspect analysis does not
confuse contents and container, or substance and form. Rather, the container is
mind-itself and the suggested nature of this container is a little different.
While I remain open to the view that, with appropriate first-person training, the nature of mind-itself appears as a form of pure, contentless consciousness,
dual-aspect monism would suggest that even a conscious state as basic as this
would have correspondingly basic, physical aspects that could, in principle, be
discovered by empirical research. As the nature of mind-itself encompasses all its aspects, it seems more accurate to describe it as
psychophysical.
[24] My analysis of consciousness (in TA and in UC chapters 1-11) deals largely with phenomenal consciousness in humans and is consequently neutral about whether there is a first-person aspect (latent or manifest) in states other than those that actually have manifestations in (recognisable) conscious experience. UC chapter 12, however, is more speculative and considers the evolution and distribution of consciousness. It compares “discontinuity” theory (that consciousness appeared suddenly at a given point in evolution) with “continuity” theory (that the potential for recognisable consciousness was there from the beginning and evolved in form as matter evolved in form). Although little of my detailed analysis of consciousness depends on it, I argue that the latter is more intellectually elegant, and it fits more naturally into Reflexive Monism. The view that consciousness is a natural accompaniment of material forms also has implications for how one might think about the necessary and sufficient conditions for consciousness in the human brain. Rather than thinking of consciousness as something that is mysteriously added to representations at the focus of attention, it can be thought of as a natural aspect of neural information processing as such. Why is it apparently absent in unconscious and preconscious processing? One possibility is that, in the evolution of complex brains with multiple sources of information, massive inhibition of information became a biological necessity to enable focus on information of greatest importance, and with it, inhibition of consciousness. On this view, unconscious and preconscious mental processes have inhibited consciousness. Conversely, information at the focus of attention is subject to release from inhibition (see Arbuthnott, 1995, for a review of the evidence, and the discussion of this and alternative theories in Velmans, 1995). Another possibility is that representations at the focus of attention are activated to a degree that masks any consciousness associated with other representations, rather as an orchestra on stage masks whispers in the audience.
[25] Many methods have been developed in both the West and the East for gaining conscious access to otherwise nonconscious aspects of mind, ranging from methods to aid recall of unconscious material in cognitive psychology and psychotherapeutic practice, to close attention to the phenomenology of otherwise preconscious mental operations (Varela & Shear, 1999), to meditative practices in Yoga, and so on.
[26] Due to lack of available space I discuss the notion of “preconscious free will”, introduced in TA, in a later issue of JCS along with commentaries by Libet, Mangan, and Claxton. It is interesting to note, however, that Gray, and Chrisley & Sloman wholeheartedly agree that free will is preconscious as well as conscious, in spite of there being other aspects of my analysis with which they disagree.
[27] Although, following Rao, I have presented the Eastern view as idealist, it is important to note that there are as many differences in Eastern philosophy about these basic issues as there are in Western thought. The combined material and conscious nature of the thing-itself is well recognised, for example, in major, modern interpretations of Vedanta such as that of Aurobindo.