Velmans, Max (1996) Consciousness and the "Causal Paradox". Behavioral and Brain Sciences, 19 (3): 538-542. Copyright Cambridge University Press
(Reply to continuing commentaries on Velmans, M (1991) Is human information processing conscious? Behavioral and Brain Sciences, 14, 651-669) 

CONSCIOUSNESS AND THE "CAUSAL PARADOX"

Max Velmans
Department of Psychology
Goldsmiths
University of London
Lewisham Way
London
SE14 6NW
England

Email: m.velmans@gold.ac.uk
URL: http://www.gold.ac.uk/academic/ps/velmans.htm
 
 

KEYWORDS: psychological complementarity, causality, consciousness, first person, third person, causal paradox, mind, conscious process, perspectival switching, mixed perspective explanations


ABSTRACT. Viewed from a first-person perspective consciousness appears to be necessary for complex, novel human activity - but viewed from a third-person perspective consciousness appears to play no role in the activity of brains, producing a "causal paradox". To resolve this paradox one needs to distinguish consciousness of processing from consciousness accompanying processing or causing processing. Accounts of consciousness/brain causal interactions switch between first- and third-person perspectives. However, epistemically, the differences between first- and third-person access are fundamental. First- and third-person accounts are complementary and mutually irreducible.

In psychological theorising over the last twenty years consciousness has been thought to play an important role in every major phase of human information processing, ranging from input (the analysis of novel or complex stimuli, selective attention) and storage (working memory, learning) to transformation (thinking, problem solving, planning, creativity) and output (speech, writing, novel or complex adaptive adjustments to the environment). In my analysis of this literature (Velmans 1991a,b, 1993) I concluded that, viewed from a first-person perspective, consciousness does play a role in these different forms of processing. That is, if one examines one's own psychological functioning, consciousness appears necessary for the analysis of novel or complex stimuli, choosing what to attend to or do, and most forms of learning and memory. It also seems necessary for most novel or complex cognitive transformations and output - how, after all, could one plan, be creative, give a lecture or write a paper if one were not conscious?

However if one examines human information processing from a third-person perspective, that is from the perspective of an external observer, then consciousness does not appear to be necessary for any form of processing. The operation of minds and brains seems to be explainable entirely in functional or physical terms which make no reference to consciousness (see also Gray, 1995; Velmans, 1995a). For example, once the processing within a system required to perform a given function is sufficiently well-specified in procedural terms, one does not have to add consciousness to make the system work. In principle, the same function operating to the same specification could be performed by a nonconscious machine. Likewise if one inspects the operation of the brain from the outside, no subjective experience can be observed. Nor does one need to appeal to the existence of subjective experience to account for the observable neural activity.

This produces a paradox, which I have called the "causal paradox" (Velmans 1991b, p716): How can it be that from a first-person perspective consciousness appears to be necessary for most forms of complex or novel processing whereas from a third-person perspective it does not appear to be necessary for any form of processing?

Commentators on my target article attempted to address this paradox in different, conflicting ways. Some argued that consciousness does not exist, or is a confused concept that has no bearing on a scientific understanding of the mind (Stanovich, 1991; Sloman, 1991; Rey, 1991); some argued that how things appear from a first-person perspective does not matter for science, in which case, for scientific purposes, the causal paradox can be ignored (e.g. Hardcastle, 1991); others tried to finesse the issue by redefining consciousness in information processing terms, for example, as being synonymous with focal attention, the contents of short-term working memory, and so on (e.g. Baars, 1991; Block, 1991; Bowers, 1991; Glicksohn, 1993; Wilson, 1991). By contrast, Rakover seeks to establish a causal role for consciousness through his two, "parallel" stories, the "mental-pool" and "cognitive-pool" thought experiments. To place his analysis within the debate as a whole I must first summarise some aspects of my own position.

In Velmans (1991a,b, 1993) I assumed:

1. that the existence of consciousness (in ourselves) is undeniable

2. that how things appear from a first-person perspective matters a great deal for our everyday lives, and also provides useful information for science, particularly psychology (reports of experienced events enter into uncountable numbers of psychology experiments)

3. that redefining consciousness to be a form of human information processing ignores its central, phenomenal properties, the "qualia" of first-person experience; consequently such redefinitions finesse the paradox of consciousness/brain causal interaction without addressing it (the difficulty of incorporating the phenomenal properties of consciousness within functional descriptions of the mind is widely acknowledged).

Once one accepts that consciousness and its contents (viewed from a first-person perspective) provide valuable psychological data one can get on with the business of working out how given conscious states relate to given forms of processing in minds or brains (viewed from a third-person perspective) in a way that does not prejudge either the ontological nature of consciousness, or its causal status. This involves a detailed examination of (a) how given conscious states relate to their causes or correlates (specified, say, in information processing or neural terms) and (b) how the first-person information (about what is going on) provided by conscious states relates to information available to external observers.

Note that (a) is quite different from (b); (a) has to do with examining where and how consciousness "fits into" the causal sequence of events taking place in the mind/brain; (b) has to do with the fact that conscious states are always about something, that is, they provide information to those who have them (about the world, about themselves, and so on), which may or may not be similar to the information available to external observers (about the same things). When discussing the senses in which a process may be said to be "conscious" it is useful to bear these distinctions in mind (as we will see below).

My review of the literature (in Velmans 1991a) ranged over all the main phases of human information processing, from information encoding, storage, retrieval, and transformation to output. In the light of the many claims in the literature that "conscious processing" is necessary for tasks that are novel or complex, not just viewed from a first-person perspective but also from a third-person perspective, I considered whether a given process could occur without consciousness (if so, consciousness could not be necessary for its operation). If a given process is accompanied by consciousness, I asked where in its causal sequence consciousness appears (if consciousness appears subsequent to the operation of the process it cannot enter into that process). I also examined how the information present to consciousness relates to the processing it accompanies (for example, whether conscious information reveals anything about the nature of accompanying processing).

I will not repeat that review here. But, to get a quick sense of how a more detailed analysis of the role of consciousness in "conscious processing" can produce surprising results, try silently reading the sentence "The dustmen said that they would refuse to collect the refuse." Reading, including silent reading of a sentence at the focus of attention, is widely thought of as a "conscious process" in cognitive science. It is generally thought that sentences, being complex, novel stimuli, are beyond the capacity of preconscious processing (Underwood, 1991; Baars & McGovern, 1996). Note though, that once the silently read sentence appears in consciousness in the form of phonemic imagery the stress pattern on the word "refuse" depends on where it appears in the sentence. On its first occurrence it is (silently) pronounced re-FUSE and on its second occurrence REF-use - stress patterns appropriate to its initial use as a verb and its subsequent use as a noun. Reading is undoubtedly a complex process involving visual pattern recognition, semantic analysis, syntactic analysis, the relating of input to general knowledge of the world, and in the case of silent reading the translation of the visual input (the printed text) into the phonemic imagery which characterises "inner speech" or "verbal thoughts." But once the word "refuse" appears in consciousness, a stress pattern appropriate to its use as a verb or a noun has already been assigned. If so, not just visual pattern recognition, but all the semantic and syntactic analysis required to determine the appropriate function of the word within the context of the sentence as a whole must have taken place preconsciously. In short, while reading a sentence at the focus of attention is "conscious" in the sense that it results in conscious phonemic imagery, the processing itself is preconscious (see Velmans 1991a, p657 for a more detailed analysis of such cases).

For reasons such as this I concluded that the conventional categorization of processes into "conscious" vs "nonconscious" (or conscious vs unconscious) is too crude to capture the intricacies of how consciousness relates to human information processing. One has to specify the sense in which a process is "conscious." A process might for example be "conscious"

(a) in the sense that one is conscious of the process

(b) in the sense that the operation of the process is accompanied by consciousness of its results

(c) in the sense that consciousness enters into or causally influences the process (Ibid, p666).

Under normal circumstances reading is conscious in sense (b) but not in sense (a). That is, one is conscious of the results of the processing, but not conscious of the processing itself. Consequently the details of such processing can only be inferred from psychological research. However, other psychological processes have conscious manifestations which do provide some information about the processes themselves. For example, the verbal thoughts which appear in consciousness when one is trying to solve a problem relate not only to the problem but also reveal something about the processes involved in arriving at a solution. Problem solving may therefore be said to be conscious in both senses (a) and (b) (consequently introspection may provide a useful adjunct to inferences from psychological research).

But note that a process may result in a conscious experience (which may or may not reveal something about the antecedent process) without the conscious experience having a causal influence on that process. That is, a process may be conscious in sense (a) and sense (b) without being conscious in sense (c). The phonemic imagery resulting from silent reading, for example, follows sentence analysis and cannot therefore enter into it.

It also has to be borne in mind that processing which results in conscious experience is likely to be functionally different to processing that operates without any accompanying consciousness. For example, processing at the focus of attention generally results in conscious experiences whereas processing outside the focus of attention does not. Of course, one cannot conclude from this that the functional differences between attentive and nonattentive processing are due to consciousness. Rather, consciousness appears to be a late-arising product of focal-attentive processing (as in the case of silent reading discussed above) (see detailed discussion Ibid, p665, Velmans, 1991b, p709).

These subtleties have been largely ignored in the many commentaries on my target article and in the cognitive literature at large. Rakover is no exception. But they are central to any serious discussion of whether or not consciousness enters into human information processing or merely results from or accompanies it. This, in turn, is central to a resolution of the "causal paradox."

It might of course be that, once consciousness emerges as a late-arising product of focal-attentive processing, it has some causal role. As Rakover notes, the appearance of given conscious states is ordered in time and it is reasonable to expect that for any given conscious state there will generally be prior and subsequent states (both unconscious and conscious). While given conscious states cannot influence prior states, they might in principle influence subsequent ones (see also Mandler, 1991). This is the main point of Rakover's "mental-pool" thought experiment. That is, he agrees with me that the initial processing of input information is carried out unconsciously, and that some unconscious states can produce conscious states. But he goes on to suggest that these conscious states, in turn, can produce unconscious states - in which case there are "effective interactions" amongst these. Consequently he writes, "in contradiction to Velmans, the thought experiment proposes that consciousness takes place in many chains of mental events and therefore plays an important role in the explanation of behavior."

As it happens, this is not "in contradiction to Velmans." I agree with Rakover that consciousness plays an important role in the explanation of behaviour in situations of the kind described by his "mental-pool" experiment. As he points out, "the mental-pool experiment is based on ... first-person or phenomenological generalized observations and their inferred phenomenological conclusions" - and, as I have argued, "Viewed from a first-person perspective, consciousness is central to the determination of human action" (Velmans 1991a, p667). However, to address the "causal paradox" one has to consider how such first-person accounts relate to third-person accounts of neural activity or human information processing.

Rakover goes on to give such an account in his third-person IPA (information processing approach) or "cognitive-pool" thought experiment. This provides a "parallel conclusion" about what is going on. Information moves from being unattended (in long-term memory) to being attended (in short-term memory) and back again, and there are effective interactions amongst these. Again, I agree with Rakover that an account (along these lines) can be given of what is going on that is, broadly speaking, "parallel" to the first-person phenomenological account. In the terminology adopted in my own papers, I have suggested that such first- and third-person accounts are "complementary" (cf Velmans 1991a, section 9.3, 1991b, section R9, 1993, section R6).

Given these similarities in our positions, what is really at stake here? In his summary of my own papers Rakover states that "Velmans (1991a,b) proposed that consciousness plays a minor explanatory role in the information processing approach, and that unconscious mechanisms process stimuli, responses and intervene between them" (his abstract). However, my actual conclusion was not that consciousness plays a minor explanatory role in the information processing approach. Viewed from a first-person perspective consciousness plays a central role - but if one views what is going on in the brain solely in third-person, information processing terms, consciousness appears to play no role whatsoever (the causal paradox)!

Rakover's 'contrasting' position is that "consciousness plays an important role in the explanation of behavior" (his abstract). In order for this position to differ from my own, this would have to be an important role in information processing accounts of behaviour. But the only arguments he gives for this important role derive from his "mental-pool" thought experiment, which views what is going on from a first-person phenomenological point of view! While such first-person "mental-pool" accounts may have "parallel", third-person "cognitive-pool" accounts, it does not follow from this that consciousness has an important role in human information processing.1 This would finesse (or fudge) the issue.

In this regard, consider Rakover's argument that once consciousness arises it could influence subsequent events. In silent reading (discussed above), once a sentence appears in consciousness in a visual or phonemic form, this experience might influence subsequent states and activities such as the interpretation of subsequent sentences or the performance of some consequent overt action. Note though, that in Rakover's own account, every time an experience appears in consciousness, "parallel" information arrives in short-term memory as a result of attentive processing. This information, presumably encoded in some neurophysiological form, could be described from a third-person perspective without reference to its conscious accompaniments - as could the effect of such short-term encodings on subsequent neural or overt action. That is, if matters are viewed solely from a third-person perspective, such neural correlates of consciousness would always replace consciousness itself in functional accounts of the brain - returning us to the "causal paradox" outlined above.

My own conclusion, given the evidence and the many lacunae surrounding this issue, was that first-person and third-person accounts remain complementary and mutually irreducible (Velmans 1991a, p667). This combined "complementarity" and "irreducibility" derives from the fact that these are accounts of observations derived from different forms of access to the events described. For example, in the situations described by the mental-pool and cognitive-pool thought experiments a subject has first-person access to the sequencing in time of his own conscious and unconscious states. But the subject does not have access to the detailed operation of his own focal-attentive processing, nor to the embodiment of that processing in neurophysiological hardware. An external observer, by contrast, has third-person access (in principle) to the subject's brain and, through a study of input-output relations, to intervening information processing. But the external observer does not have access to the succession of conscious and unconscious states in the subject.

Once one accepts that there are two fundamental forms of access to mental life (first- and third-person) the paradoxical nature of consciousness/brain interaction can, to some extent, be understood (cf Velmans, 1991b, section R9.2, 1993, section R5). While for any individual there is just one mental life (ontological monism)2, accounts of what appears in consciousness, or of information/brain processing, view that mental life from two fundamentally different perspectives (epistemological dualism). Such accounts can be first-person, third-person, or both. Accounts which are purely first-person or purely third-person do not speak of consciousness/brain causal interactions. For example, the "mental-pool" thought experiment describes what is going on purely from a first-person perspective in terms of conscious phenomenology; the cognitive-pool thought experiment attempts to describe what is going on purely in third-person information processing terms.

Accounts which do speak of consciousness/brain interaction are really mixed-perspective explanations, which employ perspectival switching. In psychophysics, for example, causal explanations typically start with stimuli (in the world or the brain) observed from a third-person perspective by the experimenter, and switch to resulting experiences, reported from the first-person perspective of the subject. Accounts of the effects of conscious experiences on subsequent brain or body states typically start with the first-person experiences of the subject and then switch to the resulting brain or body changes (observed by the experimenter) (cf Sheikh, et al., 1996). Mixed-perspective explanations can sometimes be unscrambled so that they become pure first- or third-person accounts; for example, in some future neurophysiology one might be able to replace a report of what is experienced with an account of the neural correlates of what is experienced in psychophysics experiments. However, there might be no point to such a third-person reduction. If the aim of the experiment is to chart the way in which physical changes are translated into experienced events, the mixed-perspective account remains the relevant one. That is to say, a mixed-perspective causal explanation may be entirely appropriate depending on the uses to which it is put.

Epistemically, the differences between first- and third-person access are a fundamental given of the human condition. That is, we have direct access only to our own conscious experiences; we have access to other minds, brains or things only in terms of what can be viewed or inferred from the outside. In order to develop a more complete picture of how consciousness as such relates to minds or brains viewed from the outside one needs to relate first- to third-person descriptions (of experiences and brain/information processing respectively) without confounding these accounts. That is, it makes sense to speak of brain states causing conscious experiences or the reverse, only if one acknowledges that one is switching perspectives. Conversely, it does not make sense to claim that states of consciousness really are aspects of information processing, nor that states of consciousness play a (major or minor) role in pure, third-person information processing accounts of the brain. As I have argued in Velmans (1991a), information processing models that claim to incorporate consciousness within their workings are ultimately reductionist. They collapse the subject's first-person perspective into the external observer's third-person perspective, a collapse which a complementarity principle would not allow (Ibid, p667).

NOTES

1. As Leibniz (1686) pointed out, the apparent existence of "parallel" series of mental and physical events could be simply explained by some pre-established harmony. Rather than there being any causal, mind/body interaction, these might be like two perfectly aligned clocks, each keeping time exactly with the other - giving the illusion of causal interaction ("Parallelism"). While a "pre-established harmony" (established by God) has little current favour, the point that parallel events do not entail causal interaction still holds good (correlation does not entail causation).

2. Elsewhere I have suggested that this one mental life may be thought of as a special kind of information developing over time. At the interface of consciousness with the brain (at the interface of consciousness with its neural correlates) this information takes the form of representations (of the external world or of oneself) which are accessible from first- and third-person perspectives, although the format in which that information appears depends on the perspective from which it is viewed. That is, accessed from a first-person perspective this information gives structure to phenomenal experience; accessed from a third-person perspective, the same information will appear to be encoded (within the neural correlates) in some neurophysiological form (Velmans, 1991b, section R9.3, 1993, sections R3.3, R5, Velmans, 1995b). A similar position has recently been put forward by Chalmers (1995).

REFERENCES

Baars, B.J. (1991) A curious coincidence? Consciousness as an object of scientific scrutiny fits our personal experience remarkably well. Behavioral and Brain Sciences 14(4):669-670.

Baars, B.J. and McGovern, K. (1996) Cognitive views of consciousness: What are the facts? How can we explain them? In: The Science of Consciousness: Psychological, Neuropsychological, and Clinical Reviews, ed. M.Velmans. Routledge.

Block, N. (1991) Evidence against epiphenomenalism. Behavioral and Brain Sciences 14(4):670-672.

Bowers, K.S. (1991) (Un)conscious influences in everyday life and cognitive research. Behavioral and Brain Sciences 14(4):672-673.

Chalmers, D. (1995) Facing up to the problem of consciousness. Journal of Consciousness Studies 2(3):200-219.

Glicksohn, J. (1993) Putting consciousness in a box: Once more around the track. Behavioral and Brain Sciences 16(2):404.

Gray, J. (1995) The contents of consciousness: A neurophysiological conjecture. Behavioral and Brain Sciences (in press).

Hardcastle, V.G. (1991) Epiphenomenalism and the reduction of experience. Behavioral and Brain Sciences 14(4):680.

Leibniz, G.W. (1686) Discourse of Metaphysics, Correspondence with Arnauld, and Monadology. Trans. by M. Ginsberg, London: Allen & Unwin, 1923.

Mandler, G. (1991) The processing of information is not conscious, but its products often are. Behavioral and Brain Sciences 14(4):688-689.

Rey, G. (1991) Reasons for doubting the existence even of epiphenomenal consciousness. Behavioral and Brain Sciences 14(4):691-692.

Sheikh, A.A. (1996) Somatic consequences of consciousness. In: The Science of Consciousness: Psychological, Neuropsychological, and Clinical Reviews, ed. M.Velmans. Routledge.

Sloman, A. (1991) Developing concepts of consciousness. Behavioral and Brain Sciences 14(4):694-695.

Stanovich, K.E. (1991) Damn! There goes that ghost again! Behavioral and Brain Sciences 14(4):696-698.

Underwood, G. (1991) Attention is necessary for word integration. Behavioral and Brain Sciences 14(4):698.

Velmans, M. (1991a) Is human information processing conscious? Behavioral and Brain Sciences 14(4):651-669.

Velmans, M. (1991b) Consciousness from a first-person perspective. Behavioral and Brain Sciences 14(4):702-726.

Velmans, M. (1993) Consciousness, causality and complementarity. Behavioral and Brain Sciences 16(2):409-416.

Velmans, M. (1995a) The limits of neurophysiological models of consciousness. Behavioral and Brain Sciences (in press).

Velmans, M. (1995b) The relation of consciousness to the material world. Journal of Consciousness Studies 2(3):255-265.

Wilson, T.D. (1991) Consciousness: Limited but consequential. Behavioral and Brain Sciences 14(4):701.