What, in essence, characterizes the mind? According to Searle, the potential to be conscious provides the only definitive criterion. Thus, conscious states are unquestionably "mental"; "shallow unconscious" states are also "mental" by virtue of their capacity to be conscious (at least in principle); but there are no "deep unconscious mental states" - i.e. the rules and procedures, inferred by cognitive science to characterize the operations of the unconscious mind but lacking any access to consciousness, are not mental at all. Indeed, according to Searle, they have no ontological status - they are simply ways of describing some interesting facets of purely physiological phenomena.
Given the thrust of this argument, one might be forgiven for believing that, for Searle, conscious states and shallow unconscious states are not purely physiological phenomena. But one would be wrong. Searle is a physicalist. Deep unconscious states are purely physiological; shallow unconscious states are purely physiological (but with the capacity to be conscious); and conscious states are also physiological, although with higher-order emergent properties (see note 4). In short, when Searle contrasts conscious mental states and unconscious mental states with purely physiological states, he means to contrast the physical and the physical with the physical.
So what is all the fuss about? What is crucial, according to Searle, is whether a state has intentionality, and in the target article this is determined by whether or not the state has aspectual shape (section II, step 2). What characterizes the "mental" is that "whenever we perceive or think about anything, it is always under some aspects and not others that we think about that thing". When mental states are conscious it is easy to see how this works out in practice. A conscious desire for water, for example, is not the same as a conscious desire for H2O, although the referent of the desire may be the same in both cases. But how can an unconscious state have aspectual shape? Only in so far as it has the potential to be conscious, claims Searle, for aspectual shape "cannot be exhaustively or completely characterized solely in terms of third person, behavioral, or even neurophysiological predicates" (section II, step 3). Without reference to consciousness, he argues, there would be no way of distinguishing a desire for water from a desire for H2O. Shallow unconscious states, therefore, are rendered "mental" solely by virtue of their connection to consciousness (the Connection Principle) but deep unconscious states, lacking this connection, are not.
The committed Physicalist reader might well feel uneasy at this point. If shallow unconscious states cannot be characterized entirely in terms of physiological predicates, how can they be entirely physiological, as Searle claims? Even worse, what is so special about being conscious that it should lie forever beyond the domain of any conceivable third-person description? But Searle sails on. The problem, he says, has nothing to do with ontology, but with epistemology. Shallow unconscious states and conscious states are just physiological states, but their aspectual shape cannot be known without subjective access to consciousness.
In one sense, of course, Searle is right. In so far as aspectual shape is manifest in conscious experience, it cannot be fully known from the outside, as we do not have full knowledge of other people's conscious experience. Searle's argument that aspectual shape can only be known via subjective conscious experience, however, is essentially circular. While the fact that "whenever we perceive or think about anything, it is always under some aspects and not others that we perceive and think about that thing" might, in principle, apply to both conscious and unconscious perceptions and thoughts, according to Searle unconscious perceptions and thoughts are just physiological states made "mental" by their potential connection to consciousness. His claim that, apart from the way they are known in consciousness, such unconscious states have no aspectual shape then establishes his case by definition. It is possible, however, to disagree.
One cannot fully know the shape of another's conscious thoughts, but this does not preclude knowing something, from observations of behaviour, about whether their thoughts are directed to one aspect of a thing rather than another. Perhaps in some future neuroscience such behavioural observations could be combined with direct observations of the brain. But we don't have to wait for the arrival of Searle's "brain-o-scope" to determine, say, whether a chemist wants water or H2O. If he drinks it we assume it's water, and if he sticks it in a test tube, it's H2O!
Nor is it self-evident that the only sense in which physiological states have an aspectual shape is in so far as these are manifest in conscious experience. It is likely, for example, that all neural representations of internal or external events code those events under some aspects and not others. Indeed it is difficult to envisage how any representational system could be constructed differently. And if this is so, it is difficult to argue on these grounds that shallow unconscious representational states are "mental" whereas deep unconscious representational states are not.
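The point can be illustrated with a deliberately trivial sketch in Python (the names and "aspects" below are purely illustrative, and model no actual neural code): any encoding scheme has to select some features of its object and discard others, so even a wholly unconscious representation has an aspectual "shape" of this weak, third-person kind.

# Illustrative only: a toy "representational system" that encodes the same
# stimulus under some aspects and not others. Nothing here is conscious;
# the selectivity is built into the encoding itself.

SAMPLE = {               # the referent: one and the same liquid
    "clear": True,
    "drinkable": True,
    "formula": "H2O",
    "boiling_point_c": 100,
}

def encode(stimulus, aspects):
    """Represent the stimulus only under the listed aspects, discarding the rest."""
    return {key: stimulus[key] for key in aspects if key in stimulus}

# Two representations of the same referent, under different aspects.
everyday_water = encode(SAMPLE, ["clear", "drinkable"])           # "water"
chemists_sample = encode(SAMPLE, ["formula", "boiling_point_c"])  # "H2O"

print(everyday_water)    # {'clear': True, 'drinkable': True}
print(chemists_sample)   # {'formula': 'H2O', 'boiling_point_c': 100}

Such a sketch of course settles nothing about Searle's stronger, first-person sense of aspectual shape; it merely makes vivid why selectivity comes with representation as such, conscious or not.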
Moreover, Searle's intuition that only that which is potentially conscious is truly "mental" needs to be set against ancient, competing intuitions. To have a "mind" is also to have certain modes of functioning and capacities, an intuition dating back to Aristotle, which recurs in Descartes' attempts to demonstrate that man cannot be just a machine, on the grounds that no machine could ever use language or respond appropriately to continually changing circumstances in the ways that humans do. It is hardly surprising, therefore, that modern cognitive science has attempted to uncover the mental processes which enable human adaptive functioning (whether these be conscious or not). At what point mental "software" is better thought of as neurological "hardware" remains an open question, but this does not make the attempt to specify the software any less legitimate.
In focusing on consciousness, Searle usefully draws our attention to a central facet of the mind that is largely missing in current, Functionalist cognitive science. But in dismissing cognitive capacities or modes of functioning as further criteria of the "mental", he plumps for a definition that is equally incomplete. It is not that mental processes either produce consciousness or permit us to function in certain ways; under certain circumstances they achieve both. A science of the mind, therefore, could never be complete without addressing this duality.