Searle, John
 
 
(b. 1932, Denver, CO; Ph.D. philosophy, Oxford; currently Professor of Philosophy, UC Berkeley.) In philosophy of mind, Searle is known for his critique of computationalism, his theory of intentionality, and his work on the problem of consciousness. See Chinese room; intentionality; intention-in-action; aspectual shape; prior intention; The Background.
 

Details:
 
Searle took his Ph.D. in philosophy at Oxford, where he studied under J. L. Austin and later became Lecturer in Philosophy at Christ Church from 1957 to 1959. Subsequently he went to UC Berkeley, where he became Professor of Philosophy. Searle's early work was in speech act theory, culminating in (1969) and (1979). He is credited with having elaborated the theory of speech acts associated with Austin, and with having introduced original elements of his own into the theory, most notably regarding the role played by speakers' and hearers' intentions in constituting the meaning of speech acts. Consistent with this focus on intentionality, his interest turned to the philosophy of mind, where his major work can be seen as consisting in three main efforts: a critique of computationalism and strong Artificial Intelligence (AI); the development of a theory of intentionality; and the formulation of a naturalized theory of consciousness.
 
 
The Critique of Computationalism and Strong AI
 
The best known example of Searle's critique of computationalism and strong AI is his Chinese Room Argument (see separate entry). The main thrust of this thought experiment was to show that the syntactic manipulation of formal symbols does not by itself constitute a semantics. The implications for computationalism and strong AI were held to be the following: first, computationalism fails because the formal syntax of a computer program has been shown not to be intrinsically semantic, and second, strong AI fails because a system's behaving as if it had mental states is insufficient to establish that it does in fact have these states. Interestingly, Searle's assertion that syntax is insufficient to establish semantics predates the Chinese Room Argument and in fact represents one of the main objections to the generative grammar program that he voiced back in the early 1970s (e.g., 1972).
 
More recently (1997), Searle has argued that the Chinese Room Argument granted too much to computationalism. As he sees it now, the argument wrongly took as unproblematic the assumption that computer programs are syntactic or symbolic in the first place. Instead, he argues that there is no fact intrinsic to the physics of computers that makes their operations syntactic or symbolic; rather, the ascription of syntax or symbolic operations to a computer program is a matter of human interpretation.
 
 
The Theory of Intentionality
 
Intentionality played an important role in Searle's philosophy going back as far as his early work in speech act theory. In (1983), he formulated a comprehensive theory of intentionality. (See also separate entry for intentionality.)
 
In (1983) Searle analyzes the intentional state as consisting of a representative content in a psychological mode. Although many representative contents consist in an entire proposition, not all do, and propositional form is not required. Searle also analyzes intentional states in terms of their directions of fit (which can be world-to-mind, mind-to-world, or null) and directions of causation (which can be mind-to-world or world-to-mind). (For further discussion of directions of fit and causation see separate entries on intention-in-action and prior intention.)
 
An important feature of Searle's theory of intentionality is something he calls The Background. The Background is theorized to be a set of skills, capacities, and presuppositions that, while being nonrepresentational, makes all representation possible. (For further discussion of The Background see separate entry.)
 
 
The Theory of Consciousness
 
Searle's theory of consciousness is given major exposition in (1992) as well as in the essays collected in (1997).
 
The first basic principle grounding Searle's theory of consciousness is that consciousness is irreducible. For Searle, consciousness is essentially a first-person, subjective phenomenon, and thus talk of conscious states cannot be reduced or eliminated in favor of third-person, objective talk about neural events. Any such attempt at reduction, Searle argues, simply misses the essential features of conscious states -- that is, their subjective qualities. (See also entry on aspectual shape.)
 
The second basic principle is that consciousness is as much an ordinary biological phenomenon as is digestion. It is from this principle that Searle derives an argument for a non-dualist, causal approach to the problem of consciousness. According to Searle, brain processes at the neural level cause conscious states; accordingly, conscious states just are features of the neurobiological substrate. Searle further argues that if consciousness is to be considered a feature or effect of brain processes, we must be clear that it is not an effect separate from and posterior to the brain processes causing it. For Searle, that ordinary picture of cause and effect is misleading when applied to consciousness because it unavoidably leads to dualism, which is untenable. Instead, Searle argues that the relation between consciousness and its causal brain processes involves a kind of non-event causation, analogous to the way gravity (a non-event) causes an object to exert pressure on an underlying surface. Searle has put the point another way by describing consciousness as an emergent property of brain processes in the same sense that water's liquidity is an emergent property of the behavior of H2O molecules.
 
It should be noted that Searle's biological naturalism does not entail that brains and only brains can cause consciousness. Searle is careful to point out that while it appears to be the case that certain brain functions are sufficient for producing conscious states, our current state of neurobiological knowledge prevents us from concluding that they are necessary for producing consciousness.
 
 
Daniel Barbiero