5 Consciousness: The Organismic Approach



Neuropsychoanalysis: An Interdisciplinary Journal for Psychoanalysis and the Neurosciences. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/rnpa20

Consciousness: The Organismic Approach
Philip Clapson

P.O. Box 38225, London NW3 5XT, United Kingdom. Published online: 09 Jan 2014.

To cite this article: Philip Clapson (2001) Consciousness: The Organismic Approach, Neuropsychoanalysis: An Interdisciplinary Journal for Psychoanalysis and the Neurosciences, 3:2, 203-220, DOI: 10.1080/15294145.2001.10773356

To link to this article: http://dx.doi.org/10.1080/15294145.2001.10773356



Consciousness: The Organismic Approach

Philip Clapson (London)

Abstract: In recent years, an approach has been emerging which might be called organismic. It seeks to position consciousness, and thus human experience, in the biological processes of the organism. If accepted, it will fundamentally change our self-understanding and our understanding of organisms in general.

The approach is seen in different writers, but not always fully carried through, or a particular emphasis may disguise the full implications. The aim here is not to provide a comparative critique, but to try to stabilize some central principles and appreciate their import, and propose a research program.

Background: The Inner World

The Notion of Soul

To begin with, I want to introduce the idea that, even today, the notion of soul lies behind our thinking, though it does not match the nature of the universe as we understand it. In fact, the term is not often used (by comparison with the self, for example). Still it retains a potency that no other term has. It has three important characteristics. These are relatively transcultural.

1. Being. Ignoring the immortal and immaterial aspects (which historically have been of crucial significance), the soul is that by which a person is understood to be what they are. It is inside the (living) body and represents the essence of the person's characteristics, and is crucially influential on their actions and destiny.

Philip Clapson is a philosopher working in London, who has been developing his theory of the biology of experience over a number of years.

Acknowledgment. I would like to thank Mark Solms for helpful discussions on an early version of this paper.

1 Person and mind are other substitutes, as will be discussed. Note: The argument to follow is not supposing the soul as a homunculus, as in the argument against an inner observer of consciousness, or subject pulling the strings of behavior.

2. Enduring. The soul endures. From birth to death the soul provides continuity.

3. Experiencing. Ultimately, and perhaps at the surface, it is the soul that experiences (is glad, suffers, etc.).

For us, in principle, the category human being implies uniqueness, sanctity, and ultimate value. The notion of soul functions not merely, or even perhaps mostly, as an explicit category, but as a way of grasping preconceptually this categorial significance. Thus, in Lakoff's terms (1987), it is a prototype, even metonymic. The notion of person has become the political and social substitute of choice; but person is vague outside its contextual use.

Since the notion of soul in its metaphysical aspects is commonly unacceptable in modern thinking (outside a specifically religious framework), its use is, in Lakoff's terms, metaphorical. When it is said that something touches the soul, it is understood that the nature of the experience is thereby expressed, but not explained. I shall return to the point of this later, but turn now to history.

Descartes

The difficulty Descartes introduced was the division between a thinking substance, the mind, and the material world, including the human body. Although Descartes did not use the term consciousness, it is accepted that what he was talking about was what came to be called consciousness (for a history of the term, see, e.g., Humphrey, 1992).

Truly divisive was Descartes' conception of the causal relevance of the mind (1985). He expressly understood the mind to be the harborer of experiences (thoughts, sensations, etc.) which, being considered and understood by the subject, or I, facilitated the initiation of action. An injured foot causes signals that make their way to the brain, which, transferred to the mind, are felt, and this causes, or allows, the I (of, or as, the mind) to initiate remedial and precautionary action. Descartes' language, as is characteristic of this kind of discussion, is not precise. Mind, self, soul, and I are often interchangeably deployed. Descartes was aware of the brain and that it had great powers; but the true being of the human was the mind as a distinct substance with its own characteristics.

There seems to be no quibble that our experience involves the awareness of the world, and of our thoughts and feelings. There seems to be, in conscious experience, something grasped; and indeed the possibility of grasping anything, about the world or ourselves, seems to depend upon the fact of consciousness.

Mind as Us and as Organ

Mind has a hybrid or dimodal existence as thing and operator. While many subsequent writers were not so confident of, even opposed, its separateness from the body, it retained this ambiguous nature. That we experience lent credence both to the idea that it is the mind that experiences, as perceptions, feelings, and thoughts, and also that, as evidenced by the will, it is the I of the mind that acts, referencing thoughts or motives as the cause or justification of those acts.

However, whilst our experiencing can seem to be welded to the world, as in direct perception or decision making and acting, a teeming quantity of thoughts and fancies, feelings and imaginings also occurs to us all the time. These seem unrelated to that with which we are involved, and occur without any apparent intention on our part. Thus, whilst there is a way in which the mind is associated with our selves as active agents, representing us as world operator employing our bodily being for its purposes, at the same time the mind is itself filled with apparently independent activity over which we have no control. This renders unconvincing the idea that "we" (as the mind) are in control of it.

The polar extremes may be seen with the ego-centered philosophy of Fichte (1994), in which the world (as represented in the mind) is derived from the ego; and the philosophy of Schopenhauer (1818), in which the forces of nature expressed as Will determine what the mind undergoes and accomplishes. For Fichte, the individual has absolute responsibility since the mind is derived out of the ego; with Schopenhauer there is no such responsibility, since the mind is simply where (or how) the forces of world as Will representationally act themselves out.

For Hume, the I becomes opaque to the mind, for there is no way of looking into the mind to find the I whilst it is looking into itself.2 But where, then, is the I? Kant (1781) attempted to resolve Hume's difficulty by making the I the logical requirement of experience itself (thence Fichte). But he also gave this I a noumenal character to elevate it from the causal forces of the world (thence free will). His solution has not found much favor (thence Schopenhauer).

The mind, characterized thus, is deeply problematic. It seems to defy a simple account, or any functionally consistent explanation; and besides: (1) What, ontologically and operationally, exactly is it? (2) How can its operation in the physical universe be understood?

Consciousness and Nonconsciousness, or the Unconscious

The (iconic) mental categories of the early moderns - thoughts, feelings, perceptions, and so on (the products and mechanisms of the mind) - were an attempt to individuate and explain our psychological life (reference is to Locke, Berkeley, Hume). These kinds of categories still dominate philosophical and psychological discourse. The problem is, where are they? No one has seen a feeling or a thought, and a look into the brain will not reveal them. They are, of course, explanatory categories of behavior, or labels attached to conscious experience. But they are deployed as if they really existed, doing the job we take them to do. It is on this assumption that both Functionalism and the classical computer model of the mind became dominant in the late twentieth century. Looking back, the computer model was born of new-gismo enthusiasm and hope rather than insight. For consciousness per se, from which the idea of mentality originates, had to be explicitly ignored. Its actual behavior is not clean-cut in the way theorists of the mind required.

Moreover, Descartes' view of the mind, centered upon consciousness and its reflexivity, was explanatorily inadequate because it did not comprehensively account for human activity. By the time Freud developed his two successive metapsychologies (psychoanalytic theory), the idea of the unconscious was widely accepted.

2 This reading takes into account Hume's dissatisfaction, in the Appendix, with his own account in "Personal Identity" in the text of the Treatise, Part iv, Sec. vi, Hume (1739-1740).


Most damaging for Descartes' position, Freud showed that consciousness was not transparent; an individual's motives could be unknown to themselves, and their explanation of themselves false. Those motives were unconscious.

Both philosophy and cognitive psychology have found Freud's theories difficult because they rob consciousness, or more appositely mentality, of its reasoning essence, and therefore motivational tractability. Freud was in a line from Schopenhauer through Nietzsche.

By developing an account of consciousness that begins in the body and instinctual drives, Freud tried to bridge between the organic and the mental. In the first theory he proposed that consciousness was, in a sense, a filtered operation: only so much was let through by a censor, the determining factor being social (or self-regarding) acceptability. In the second theory, the ego constantly defends itself from the primal forces of the id and the prohibitive aims of the superego, which are at odds with its "reality principle." In the second theory, the instinctual depths of the id are necessarily "unavailable" to consciousness.3

Though Freud's theories are seen as problematic, cognitive psychology has embraced the nonconscious, for otherwise much of human activity - the process of thinking, for example, as opposed to the result of thinking - is inexplicable (cf. LeDoux, 1998; Seager, 1999). In this, psychology mimics Freud's position. Reference to consciousness per se becomes irrelevant, for talk of the mental encompasses the conscious and the seamlessly linked and functionally symbiotic nonconscious. The actuality of consciousness could be discreetly forgotten.

Freud's division of the mental realm, and its internal warfare, represents an inner mental world, as does cognitive psychology's notion of the supportive nonconscious to the conscious as mind.

The Mind-Body Problem

This topic lies at the heart of what is to follow, but not in the terms in which it has received so much attention: the attempt to reconcile, reduce, or eliminate one of the disparate entities. The problem is that mentality seems not the same as physicality.

3 Freud's early "Project for a Scientific Psychology" (1895) was his subsequently abandoned attempt to create a fully materialistic neural account of human functioning. But he never left behind either the biological approach, or its terminology (Sulloway, 1983). For an overview of Freud's theories see Laplanche and Pontalis (1973).


For Descartes this was an ontological distinction; for we moderns, what is involved in the distinction remains controversial.

The arguments turn upon a simply expressed idea: If, for example, I see a bunch of red roses, and they appeal to me and I decide to buy some, I understand my feelings, thoughts, and actions to arise within my experiencing. I go up to the vendor and say, "Your roses are splendid; I'd like a bunch." My apparently voluntary acting and speaking seem to be caused within and from my experiencing. But at the same time I also suppose that my brain, which is in some way the same as (or generating) my experience, also carries out operations that result in whatever I do as organism in the world. The sensory and neural processes of my vision system interact with the neural processes "underpinning" my desire and judgment (in, for example, the hypothalamus, hippocampus, and prefrontal cortex), followed by directed activity in my motor and vocal systems, thus providing a strictly determined physical account of the seeing and purchasing of the roses. While these two things (experience-brain) must be the same in some way, they do not appear to be, not least in terms of causal explanation. And given a view of the world that says that cause and effect are physically determined, and since the brain (and the extended nervous system) are the physical domain, the position of my experiencing could be seen to be superfluous. But it doesn't seem to me to be superfluous, nor to people at large. Hence the project to solve, or dissolve, the problem.

However, at its heart lies a supposition that, in the Anglo-American philosophical tradition, and in psychology and neuroscience, has hardly registered. If it is false, then the centuries of spilt ink will have been the attempt to solve a non-, or misperceived, problem. The supposition is that, biologically, experience, and what the brain and nervous system do, are in fact the same thing. More precisely, it only becomes a necessity to perform the reconciliation, reduction, or elimination if the biological aim of each domain is directed toward the same end. At least, those are the terms in which, implicitly, the problem has been posed.4

What terms? The soul notion. Whilst the actuality is rejected, the operative significance remains. My experience is what I am in mentalist terms, just as neuronal processes are what I am in physical terms. It seems straightforward to say: My circulating blood is part of me; my neuronal processes are part of me; and my thoughts and feelings are part of me.

4 With two causal realms, for example, there is the problem of causal overdetermination.


The clue is the "part of me"; for whilst there is no actual soul, the "me" substitutes for it. "Isn't it all going on inside me? We're just talking about what is inside the physical boundary."5 But the inclusion of all these elements (blood, neurons, thoughts) is the soul notion, now defined by the physical boundary. Returning to our three soul characteristics, we see they are operative: (1) Inside the boundary is the being of the individual. (2) Over a lifetime this boundary, defined by the DNA, even if bits are lopped off or artificially welded on, identifies what is the individual as enduring. When dead, unless cryonically maintained, the boundary goes. (3) Experiences are inside the boundary. The individual cannot experience outside themselves, for even "out-of-body" experiences are deemed to result from brain states within the boundary. The boundary defines what are the individual's experiences, even if not exactly what it is that is experiencing or what experiencing is.

By these lights, the mind-body problem is indeed problematic. But if these lights misportray the situation, then the problem isn't there. How could this be?

Bewildering Experience

What is not at issue is that whatever goes on in the brain and nervous system is operating to effect what the body does. If we could imagine the body operating without experience, then we would be dealing with a "pure" physical entity acting in the world. We would be dealing with one of the philosophers' favorite characters: the zombie. This entity has sensory organs, an operative body, a ratiocinating brain full of plans and programs designed to achieve prespecified and developing goals. It is, in fact, just the same as a human being except that it does not experience. And for such philosophers as David Chalmers (1996), therein is the mystery: Why do humans experience? What is its point?

I put this slightly differently from the strictly nonreductive thesis of Chalmers and fellow travelers, because the nonreductiveness is really secondary. The central and positive question is: What is experiencing for? Its nonreductiveness or potentially epiphenomenal character only becomes an issue if no biological purpose for experiencing is identifiable. And the difficulty lies exactly where Descartes posed his solution.

5 About this most writers are absolutely explicit, for example Dennett (1991, 1996) and Damasio (1999); see below.


If there is damage to the foot, the signals travel to the brain which, transferred to the mind, results in pain, which alerts the I of the mind (or the mind itself) to take remedial or preventative action. But our zombie does not need all this mind stuff, because all that is involved in the process, including the remedial and preventative action, can be carried out without the intervention of (experienced) feeling. For all that is (really) involved (i.e., at the neuronal reductive level) is information processing. Thus our experiencing is apparently superfluous, mysterious, beyond physical explanation.

Well, for Descartes it was information processing, but it took place in a mental substance divorced from the body, transduced from the body through the pineal gland. The problem, for Chalmers et al., is: if it is not divorced from the body, why does it exist over and above that which is truly necessary for information processing, the neuronal channels of the body with their electrochemical transmitters?

Into Biology

Transition to the Organismic Approach - Various Writers

It is indicative of its power that even those who do not accept nonreductive positions like Chalmers' can find themselves snared by the soul notion.

The organismic approach resists what consciousness appears to reveal "to us." It is de facto anti-Cartesian. Why? Because what the organism does it must do without recourse to what we appear to be as "a knowing thing" as consciousness, unless we insist on elevating it above biology (cf. Clark's final chapter, 1997). There is no way of creating a divide between organism and experience, and granting experience some grounds the organism does not have, without accepting the sense, if not the letter, of Descartes' proposal (otherwise, in Ryle's words [1949], the ghost remains in the machine).

John Searle's (1992) particular physicalism attempts to do so by saying that brains are of such a biological character that consciousness (whatever that is deemed to be) just is its result. Thus, mental processes in consciousness - like introspection and transparency - can be understood as a given of the nature of the emergent domain consciousness is. But, as many understand it, this is nonexplanatory and/or implausible.


Daniel Dennett, on the other hand, avows a kind of organismic approach. "From the outset I worked from the 'third-person' point of view of science, and took my task to be building ... a physical structure that could be seen to accomplish the puzzling legerdemain of the mind" (1994, pp. 236-237). Dennett's (varying) positions divide into two groups: The first concerns intentionality made manifest by "mere machinery" - the decomposed homunculi of mental tasks, for example (his topic of content).6 The second relates how the brain actually operates as, or generates, consciousness - his multiple drafts model, for example (his topic of consciousness).

Before looking at Dennett's position, it will help to be more precise about what is involved in the organismic approach. There are four points:

1. Brains and nervous systems appear to do all the work in the physical universe of operating the human being.

2. We have (are) experience, which it is understood results from consciousness, which itself results from, or is an aspect of, the brain's activity.

3. There is an explanatory set of concepts called psychology (various, actually) which account for human behavior and, to varying degrees, incorporate the apparent nature of consciousness.

4. There is, in some as yet unspecified way, the question of what the above says about what is going on in human biology.

The organismic approach must take point 1 not only as mandatory but also as precedent. Philosophers of mind (usually) think about the person reversing the precedent order between points 1 and 2. Dennett explicitly takes the (normally) antiphilosophical route.

But having done that, we return to point 4. What is the biological position of consciousness, point 2? For it is the fact of point 2 that generates the possibility of point 3, psychological explanation.

Dennett's way of dealing with this is twofold. First, Dennett does not believe in the actuality of point 3, psychological explanation, because, as a robust physicalist, he considers the intentionality of a human being "as if"; the organism is to be interpreted as if it believed and desired (for example), whereas it only seems to because actually it is just (bio-)mechanics (see his Intentional Stance, 1987).

6 Intentionality, or "aboutness" as is often said now, is the mark of the mental according to Brentano (1874), a term of scholastic origin.


Second, in the case of consciousness, the observer (third-person) heterophenomenological method interprets the untrustworthy subject's (first-person) phenomenological account. A subject's self-understanding is (anthropologically) edited and contextualized by the observer, and thus given objective credence. What the first person's phenomenology seems to be, a happening inside them that is observed by them, is just the judgment they have and can express; there is no Cartesian Theater, no (added) observer-subject of the phenomenological "scene" over and above the judgment the subject (can) talk about (1991).

Moreover, Dennett's view is that there is no self, but a Center of Narrative Gravity: the things we tend to say because those are the ways we have (learned) to be. Externally (just as in the Intentional Stance) there is an appearance of self because these same or similar things issue from the same mouth (along with other behavior). Inside it could never be found.

But in spite of all of this, and contrary to his avowal, Dennett's position has not left the soul notion behind. Consider these sentences.

Mental contents become conscious not by entering some special chamber in the brain, not by being transduced into some privileged and mysterious medium, but by winning the competition against other mental contents for domination in the control of behavior and hence for achieving long-lasting effects - or as we misleadingly say, "entering memory." And since we are talkers, and since talking to ourselves is one of our most influential activities, one of the most effective ways for a mental content to become influential is for it to get into position to drive the language-using parts of the controls [1996, pp. 205-206].

Dennett's view here is that the brain "harbors" mental contents that are vying to become current consciousness. But the brain has no mental contents.7 Current best theory says the brain is a massive modular neural network, operating by electrochemical transmission. Dennett concurs with this: for mental means (for example) intentional, which for Dennett is "as if." What he really means, presumably, is that neural states vie with each other to become conscious - and apparently intentional. But why are neural states vying to become conscious? Answer: to be influential.

7 Cf. "If your model of how pain is a product of brain activity still has a box in it labeled 'pain,' you haven't yet begun to explain what pain is" (Dennett, 1991, pp. 454-455).


On what? Behavior. Why? How? Dennett never convincingly explains.

Since, as is well known, our behavior is extensively controlled by neural processes that are never conscious, including some decision taking,8 and much that we are conscious of seems irrelevant to our current behavior,9 and sometimes as consciousness we say things differently from what we actually intend and do, and, as Freud pointed out, we often do things while explaining ourselves with reasons that are not why we are doing them at all (see Nisbett and Ross, 1980, for experimental results), what is needed is an explanation of the significance of those states that actually are conscious. Thus Dennett's account, that our judgments, our narrative, our images control our behavior, simply does not adequately explain why they are actually there as they are - or are not there, while behavior still goes on. The soul might necessarily meaningfully control (or Descartes' mind with its clearly thought self-understanding), but this is not what consciousness necessarily does.10 Moreover, even if, for Dennett, conscious states are brain states, he is still left with the apparent duality of domains controlling our behavior: viz., why does control have to be this emergent consciousness as and from the physicality?

He says that: "Such questions betray a deep confusion, for they presuppose that what you are is something else, some Cartesian res cogitans in addition to all this brain-and-body activity" (1997, p. 206). But that is not the issue. The issue is: Why/how do just these images, words, feelings, moods, thoughts control what the organism does? What is the relation between my seeing the roses and my buying some, explained as brain function? What functional contribution to what I do (as organism) is it that I have a seeing (or thought or feeling) - point 4? The soul notion may embrace domains as me or mine, but it cannot tacitly be assumed to explain their interconnection; it does not.

The neurobiologist, Antonio Damasio, whose work The Feeling of What Happens (1999) has been widely discussed, takes an explicitly organismic approach. "Consciousness begins when brains acquire the power ... of telling a story without words, the story that there is life ticking away in an organism, and that the states of the living organism, within body bounds, are continuously being altered by encounters with objects or events in its environment, or, for that matter, by thoughts and by internal adjustments of the life process" (p. 30).

8 Experiments of unconsciously influenced decision making (e.g., Damasio, 1999, p. 301).

9 Not only our rich explicit fantasy life, but much else besides.

10 Memory, for example, is procedural as well as factual and personal; but procedural memory (e.g., learning to ride a bicycle) is not rendered by conscious conceptual thought.

The crucial move Damasio makes is to locate consciousness as an organismic biological component.11 What that component does, in principle, is portray the encounter between organism and world (including itself as "inner" world). It is a story of organismic engagement. So Damasio takes on point 4 directly. Thus he disengages the intuition of consciousness as the viewed scene of the observer, Dennett's Cartesian Theater. Moreover, and more true to the phenomenology, he does not insist on Dennett's unconvincing essential tie of consciousness and the control of behavior.

Damasio takes as a central fact the findings of Libet, Curtis, Wright, and Pearl (1983). In the last two decades, Libet's location of the delay between the nonconscious beginnings of voluntary action and awareness of same has carried the implication that consciousness (and thus what we experience) cannot be a state that has its own powers of initiation or control of action.12 For consciousness seems to come at the end of the brain (and nervous system) processing cycle, when action (for example) has already been initiated; the delay is between 350 and 500 msec.

Damasio's original project (1994) along these lines was to rehabilitate emotion into an account of the reasoning processes of the human organism. Broader now, as the title of this book indicates, he looks to link the facts of experiencing and the facts of neurophysiology, thus to position consciousness in the biology of the organism. The following summarizes various elements of Damasio's position:

The idea of consciousness as a feeling of knowing is consistent with the important fact I adduced regarding the brain structures most closely related to consciousness: such structures, from those that support the proto-self to those that support second-order mappings, process body signals of one sort or another, from those in the internal milieu to those in the musculoskeletal frame. All of those structures operate with the nonverbal vocabulary of feelings. It is thus plausible that the neural patterns which arise from activity in those structures are the basis for the sort of mental images we call feelings. The secret of making consciousness may well be this: that the plotting of a relationship between any object and the organism becomes the feeling of a feeling [1999, p. 313].

11 One reason for this is that Damasio does not come from, or endorse, the computational view.

12 For example, "The brain evidently 'decides' to initiate, or, at least, prepares to initiate the act at a time before there is any reportable subjective awareness that such a decision has taken place. It is concluded that cerebral activity even of a spontaneously voluntary act can and usually does begin unconsciously" (Libet et al., 1983, p. 640). However, Libet's own explanation of this, that consciousness acts potentially to veto unconsciously initiated action, is dualistic and implausible, since any veto consciousness imposes must itself be unconsciously initiated 350 to 500 msec previously, as commentators have pointed out.

Libet's findings actually support our common experience. In fast sporting action "decisions" are taken before awareness of them. And we duck before hearing the thunder, when it is overhead.

In being a "feeling of a feeling," Damasio is attributing consciousness to a second-order neural state "knowing," or being the feeling of the thereby known, first-order feeling state. He is not explaining how this can happen, but hypothesizing both what and where the mechanisms may be, and their functional significance. For the "being known" experienced as consciousness raises it to functional significance by being the known process of an experiencer. "Consciousness is thus valuable because it introduces a new means of achieving homeostasis ... [e.g.] handl[ing] the problem of how an individual organism may cope with environmental challenges not predicted in its basic design such that conditions fundamental for survival can still be met" (p. 303).

This value, at its most basic level, is that: "The simple process of feeling begins to give the organism incentive to heed the results of emoting" (p. 284). Emoting is not a conscious state but an organismic state in relation to an object that can become feeling, and then consciously known when it is second-order. "Suffering begins with feelings, although it is enhanced by knowing, and the same can be said for joy" (p. 284). The value of this is: "That the mechanisms which permit consciousness may have prevailed [in evolution] because it was useful for organisms to know of their emotions ... and it became applicable not just to the emotions but to the many stimuli which brought them into action" (p. 285).

Now although Damasio's account of consciousness as a portrayal of neural states may be an advance over Dennett's control account, a problem remains. And this is in precisely the same place. An emotion rises from a neural state to (another neural state of) feeling to (yet another neural state of) a knowing of feeling. What exactly is being explained thereby? What domain is inhabited by feeling and the (conscious) feeling of feeling? And why, moreover, should this be motivational (or an "incentive" in Damasio's terms) for the organism?13 While neural states function, we are simply adding in another stratum of apparentness, that of the mind where all this representational occurrence is taking place. And why should this bring evolutionary benefits? That there are occurrences does not explain why those occurrences function as claimed. Specifically, a feeling of pain does not explain its function, since in Descartes' account the mind has to be assumed for the function to operate, and so far no fundamental explanation of the mind exists.

Moreover, as the Libet data suggest, everything that is involved in the causal process of seeing the roses and deciding to buy occurs unconsciously. We are left purely with the results of the process (the neural portrayal) as what is conscious. Consciously I decide nothing. My consciousness simply follows along behind what my brain is deciding nonconsciously. "The idea that consciousness is tardy, relative to the entity that initiates the process of consciousness, is supported by Benjamin Libet's pioneering experiments," Damasio says (p. 287).

Thus Damasio's account remains on parallel tracks. The nonreconciliation is caused, we diagnose, by the soul notion; because, in the principle of the soul, whatever is going on in the organism has to be in and for the organism qua organism; both neural states and their conscious portrayal: isolated, solipsistic, self-operative.

Damasio correctly begins by establishing the precedence of point 1 (organism) over 2 (consciousness), and has an attempt at explanation of point 4 (biology). But because point 3 (psychology) is still his operative understanding, he lapses in his further explanation to a position where points 1 and 2 are equal and irreconcilable.

Indeed, it has been said truly that neuroscience has not yet done its job: It has no explanatory domain for the neuro- but simply grafts psychological concepts onto brain locations and their interconnections. In the words of J. Graham Beaumont (1999): "Neuropsychology is in a conceptual morass. Neuropsychologists seek to study the relationship between brain and mind, but without ever really addressing the status of these two constructs, or what potential forms the relationship between them might take" (p. 527).

The wretched ghost still haunts the feast.

Some writers, on Libet et al.'s findings, have come to the conclusion that there is something mysterious going on about what is happening as consciousness.14

13 That feelings (or sensations) are motivational is a standard premise in neuroscience and neuropsychology.


One such is Guy Claxton. In his paper, "Whodunnit? Unpicking the 'Seems' of Free Will" (1999), Claxton attempts to reconcile Libet with the sense, in consciousness, that we have free will. The notion of free will (of the mind, self, soul, person, etc.), it has been long argued, does not coincide with the determinism that neurophysiology implies.

The sense of free will, the seems of it in Claxton's terms, is exactly the knowledge occurring as consciousness to which Damasio refers. But this "knowledge" is clearly not factual or justified (in philosophers' terms). It is simply the that that the process of acting can incorporate as a portrayal of the requisite neural states. When I buy the red roses, I seem to be (I have a sense that I am) acting under my own volition. Dennett's heterophenomenology is apposite because, although I may say I act voluntarily, and though my subjective experience does not describe the reality in the physical universe, that I describe myself in this manner from my experience is an important fact about how neural states can portray themselves. Claxton gives an example this way:

Conscious premonitions are attached precisely to actions that look as if they are going to happen - and then sometimes don't. What folk psychology construes as "will power" - with its notorious "fallibility" - turns out to be a property not of some imperfect agent but a design feature of the biocomputer. ... An updating of prediction [as when we appear to "change our mind"] is reframed as an inner battle between conflicting intentions - and as further evidence of the existence of the instigatory self [p. 109].

Claxton supposes that the biocomputer - by which he means what the organism does, most of which is unconscious - portrays itself in kinds of conscious experience that we interpret psychologically. These require such "sleight-of-hand" concepts as:

[A] sense of self which includes a kind of dummy instigator claiming authorship [for our actions]. Self-as-instigator is really a simple subroutine, added to the biocomputer, which does not affect the latter's modus operandi at all, but simply takes the glimmerings of a naturally-arising prediction, and instantly generates a "command" to bring about what was going to happen anyway [p. 111; slightly changed for quote].

14 In Continental philosophy, deconstruction of consciousness happened much earlier in the twentieth century, from Husserl to Heidegger and Merleau-Ponty; see more recently Varela, Thompson, and Rosch (1991), particularly chapter 8.

Claxton recruits Libet's findings to mark the differentiation between what the organism does in its modus operandi and what is portrayed as, for example, the "self-as-instigator."15 This is undoubtedly in the organismic camp.16 The features raised, put together with others in this section, build an emerging picture which we will address explicitly later in this paper.

But still, of his account, we must ask: exactly what is the point of there being both a biocomputer and a conscious experience which is deceptive of its actual powers? For it must be the biocomputer that causes the conscious experience, and if the conscious experience is, in some sense, a fake or an illusion or deceptive17 - all these words are used in the literature to describe our experience as opposed to some other (e.g., neural) actuality - then: (1) What is the point of our experience in the first place? (2) Why is experiencing's illusory nature, in this account, simply (in Claxton's words) "comforting"? (To what, the soul?) Claxton does not address either of these points.

A similar problem arises with the views of Peter Halligan and David Oakley in their New Scientist article (2000), which refers to (and endorses) Claxton's paper. Halligan and Oakley confuse a mentalist model of the conscious-unconscious with the model conscious-neural - their "unconscious parts of the brain." There is an assumption that whatever neural processing is doing, which is of course inaccessible per se to consciousness, it supports the still justifiable distinction, which Freud's work exemplifies, between an unconscious mind full of mechanisms, schemes, and plans, including popping things into consciousness, and a consciousness ("us"), which gullibly accepts all these pulled strings of the unconscious. They finish their article thus:

Perhaps by now you will have begun to think of yourself differently, to realise that "you" are not really in control. Nevertheless, it will be virtually impossible to let go of the myth that the self and free will are integral functions of consciousness. The myth is something we are strongly adapted to maintain, and almost impossible to escape from. Maybe that's not so surprising, because, after all, "we" are part of the illusion [p. 39].

15 He also seems to endorse Libet's own account of the significance of consciousness, which we reject.

16 Although, for similar reasons to Dennett, the computer analogy is not helpful.

17 For example, The User Illusion is the title of Tor Norretranders' book on consciousness (1998).

But, as said of Dennett's probable slip of the pen, no equation of the unconscious and the neural can be made. It is highly probable that neural representation, as consciousness, gives a vision (image, picture, etc.) of the world, and feelings and language. But that does not mean that the brain is the unconscious as a physical implementation of scheming psychological states manipulating consciousness. The brain, for example, has no feelings. All that is available to science is our experience as consciousness, behavior, and brain; all else is psychology. And psychological explanation, useful in its own way, is unsatisfactory as scientific theory (as Paul Churchland [1981] contended long ago) and should not be claimed by neuroscience, which is supposed to be an account of brain function and its manifestations. This brief survey demonstrates that, although there has been steady progress on the organismic view, it is still not a clear view.

The Organismic Approach

Preface

The aim, in this section, is to stabilize some central principles, appreciate their import, and propose a research program.

When Freud derived human psychology from biology he was faced with precisely the same problem as now. Recent brain science has been informative. Brain deficit analysis and imaging have specified location and participation in the process of generating psychological states. However, the transitions remain where they were. How are we to understand the move between: (1) brain states; (2) conscious experience; (3) psychological explanation? The initial step is to disengage entirely from the influence of the soul notion. We do this by concentrating on (4) biological explanation.

The assumptions about consciousness we have reviewed have not captured its biological function in a satisfactory way (as proposed at the beginning of "Into Biology"). If we do find a satisfactory account, we should genuinely appreciate its evolutionary advantage. But will it then actually be consciousness? For we must end up with no inexplicable perceptions, awarenesses, or invocations of knowing. Our account must be entirely adequate in physical terms, with causality residing in the physical processes. Moreover, regarding the organism qua organism, the account must enhance our understanding of it, not leave us baffled. How do we do this?

We will address the task in two ways. First, we will attempt to grasp the phenomenon of consciousness directly, to identify a function for it, in the next section. Then we will pursue the argument for that discovered function. Finally, we will review some of the ramifications, before proposing a research program.

The Mind Banished

The first move is to investigate more precisely what seems to be going on between brain states and consciousness. But now we consider consciousness itself to be under question: There is a phenomenon, but we are not sure what it is. Where it is the target of our investigation, therefore, the word will be put in quotes: "consciousness."

We might suppose that if you and I look at a tree, our "conscious" image of the tree will not be that different, although our interpretations of and associations with the tree, and the world it is in, and other current thoughts, will be. This is not to lapse into the myth of the given; it is to suppose that the neural processes that manufacture the tree's image will not be that different. Again, although we will not have exactly the same toothache pain, where it is manufactured by similar neural processes, it may feel similar.18 Otherwise the aim of neuroscience - to find the locations, interconnections, neural firings, and particular neurotransmitters of conscious or behavioral events - would be implausible. Besides, some neural activities are known very precisely in relation to their world targets. Given this, the notion of the tree being "in mind," or such phrases, may be considered redundant. We could just say, with Damasio, that the image is the portrayal of a brain occurrence. Might this help?

It is widely agreed that conscious experience results from heavy neural preprocessing (see, for example, Damasio, 1999; McCrone, 1999). This both assembles the image (binding, in the brain, different elements) and generates memories, ideas, and feelings. Combining Damasio's and Libet's findings - our "conscious" state is a portrayal of the active brain structures postdating the response of the organism - we may see that the notion of a mind as intelligibly doing things is redundant, an unhelpful façon de parler.

18 This is not to say our pain will be the same for the same injury; cf. Patrick Wall (1999), below, footnote 19.


For example, the notion that the mind associates memories or thoughts with the image of the tree offers nothing over and above the brain binding its structure to give whatever occurs. Suppose I think of another tree I saw, or a girl I once knew under a similar tree, or a past tree struck by lightning: All that need be said is that the brain has bound whatever is apposite to its operation now. It is occurrent.

Consider further this line of argument. Every neural state that is bound as "consciousness" expresses (portrays) the neural status of the brain. Therefore we need not suppose there is a neural connection between, for example, memories qua memories. With the image of the tree and the memory of a girl under a similar tree, my "consciousness" does not thus remember the girl by (causal) mental association. She is simply a bound neural state. Of course, the girl may occur associated with bindings of other neural elements that bespeak "this is the past girl under a similar tree," but may not explicitly or immediately or ever. The girl-binding may be a neural process resting on some kind of neural association, that is, neural connections laid down when the original experiences occurred as neural states, or linked to others for brain-biological functioning. As they occur they may have the appearance (that is, part of the way they are expressed as "consciousness" may involve the sense) that there is a mentalist association. But as in Claxton's example of the sense of free will as causal, we would consider this an aspect of the portrayal process, not indicative of mental powers. The girl may occur as a mere unrecognized fragment.

Thus the argument would conclude that "consciousness" is not causal upon what happens in the organism. Neural states cause each other; they cause behavior; they also cause what is "conscious." But a "consciousness" does not cause another "consciousness." Nor does it cause nonconscious activity of the brain: that is, the brain does not read its own conscious states on the supposition that otherwise it would not know them (as is the case in, for example, Baars's Work Space Theory of Consciousness [1996]).

This denial of conscious causality is most dramatically understood by pain not being a motivator.19 However the neural states causing the experience of pain achieve their result, the pain itself causes nothing in the organism. For what the organism does - in taking action, for example - it does by silent neural states subsequently represented as "consciousness."

19 That the brain in fact regulates pain felt in appropriate contexts (i.e., that the experience is not a necessity of physically damaged states) is described by Patrick Wall in his book Pain: The Science of Suffering (1999).

But, while this holds the causality of the organism in the physical world, we are still left with the "conscious" occurrence of our experience. We must conclude, therefore, that our experience must be for some other purpose than internal causality.

"Consciousness" Is the Medium-Mechanism for thePortrayal of Neural Status for ExternalCommunication

And the inference to the best explanation must be that it is to communicate the status of neural states to conspecifics.

Although "consciousness" and the experience itdelivers, are neural states, those neural states are di­rected outwards to the world. This (dis)solves thecausal mind-body problem (though not how "con­sciousness" is made). It also provides the biologicalfunctional account for our experiencing, that fits withLibet et al. It answers, too, J. Graham Beaumont's(1999) concern about the status of the construct"mind." The answer is: There is no such thing. Psy­chological explanation may remain in common use,but it is not neuroscientific.

But now the question is: How can "conscious" states communicate when they are internal to the organism? This brings us to the difference between consciousness and "consciousness." To understand it, we must appreciate the incredible difficulty of communication between organisms that biology has to solve without perception, awareness, or knowledge invocation. After all, organisms are just physical objects. Yet they are brought to common action.

The organismic approach proposes that "consciousness" is a physical (neural) state of representation. It represents to signify. It signifies to communicate to conspecifics how the organism finds the world, and is reacting to the world.20 Expressing these contents exhausts the function of "consciousness."

Signification is unproblematically physical. But why does it happen?

For the organism to react to the environment, it needs to represent aspects of the environment within its brain so that, via sensory input from the environment, it can select beneficial action by manipulating those representations. They need to bear no relation to the environment in terms of form or content (for example, they could just be synaptic weightings in a neural network), but clearly they must bear a correspondence. This is relatively uncontroversial. But historically there has been an aim to show why these representations (for "higher beings") involve consciousness; why, as the phrase goes, there is thereby a world for the organism. We shall call this position representationalism.

20 Dennett (1991, pp. 410-411) refers to something like this as "semiotic materialism," quoting David Lodge.

But, suppose the organism needs to communicate with conspecifics. How will it do it? Uncontroversially, it is supposed, by transmitted codifications (e.g., signs, smells, sounds) that fit the representational capability of the reciprocal organism. For we must suppose that, just as the environment is represented in the organism, so must be the communications of conspecifics. For organism B to react to organism A, B needs to have some way of recognizing what A is communicating. Since neural structures operate in the brains of A and B, what is involved in the communication must be the modification of B's neural states by A's communication.

To react to the environment, the representational structure in the organism need not match the environment in form and content. And this is true in B for A's communication. B need not have a representational structure that "pictures" A or A's environment. All that matters is that B can read A's "code." Moreover, in order to be able to respond to the input from A, B must first integrate its current states and other information it is receiving from the environment. This process may be as complex and intricate as necessary. Its form and functionality should be as adaptive and flexible as possible. It can be silent.21

But it is different when considering that A com­municating to B involves B communicating to A. Forwhat concerns the organisms is not how they deal withthe input from each, but the nature of the reciprocatingoutput. What is now required is a structure that is ascommunicable as possible. What is required is consis­tency and appositeness. What is required is a formand content that the organisms can have adequatelyin common. Moving up the phylogenetic scale, whereability to manipulate the environment increases as or­ganisms increase in functional capacity, communica­tive representation likely will involve the form andcontent of the environment in which the conspecifics

21 "Silent," in the text, will mean "not apparent"; that is, not an image or thought or word or feeling.


For this is what they have separately but in common, and need common command of.

And there is a requirement, too, that coexisting, or corelated, elements of what is to be conveyed in the communication should be copresented, or integral, as the communication. Significant interrelation of the elements also needs to be represented.22

Now, to develop what is involved in this proposal, we build up the steps:

1. For what the organism takes in and manipulates of the environment and conspecifics, what is needed is a fast, task-specific yet widely comparative/significance-balancing processing structure, to facilitate behavioral response to a complex and changing environment.

2. For communicating to conspecifics, what is required is a reciprocal, apposite, and consistent codification between the conspecifics.

3. Thus in the communication of A behaving in the environment, which B is to receive, A must assume in B that what it (A) is communicating can be received by B, and then reciprocated by some appropriate response.

4. If A is behaving in an environment which will influence B's behavior (for example, if they are hunting lions with a gazelle in view), A must suppose implicitly that B can represent "A's behavior in the environment" and respond to it.

5. Now consider what happens in B as a result of receiving the communication of A. B processes the scene by its complex silent method that does not require representational structures that have the form or content of the environment. What is involved in its output to A? Well, B also cannot communicate the environment in which it responds to A; it also must assume (implicitly) that A is able to represent "B's behavior in the environment" in A.

6. So, without any notion of the environment or behavior or the gazelle or the conspecific, communicative success between the lions must assume all aspects are reciprocally represented within them by some means.

7. Thus there are two representational domains involved in communication, input and output. And there are two relevant factors: the actual behavior and gestures of the organisms, and the environment of their actions.

22 The obvious distinction being drawn is between the massive neural network, parallel processing of the brain, and the (brain) images of "consciousness."


8. B processes input from A by the complex silent method. The "conscious" state in B, as a result, then entails A's movement, A's gestures, the gazelle, and the environment. That very "consciousness," it is concluded here, is the response of B's organism to A's input.

In other words, from the input processing in B, the "consciousness" of the scene in B becomes the analogical output to A. It is in a representational form that fits with A's requirement for adequate communication. This is because it is the analogical recreation in B of what A is doing in the environment, together with B's actual reaction, thus output.

To lay it out further: If A moves to the right, B's silent neural states analyze A's action, and then A's move to A's right is created as B's "consciousness." For A (implicitly), this is B's response (i.e., output) to A's move. But in addition if, for hunting success, B also should move in a direction in accord with A's move, B's silent processing of A's move to A's right also causes B's move. Though this is a move per se, it is also output to A. Thus the input processing locks each to the other in the hunt program, and the output signifying that interlock is the reciprocal "conscious" state (which the other must assume) together with the behavioral change. (A schematic toy of this interlock is sketched after these steps.)

What is explicitly communicated is movement, gesture, and vocalization (what we call the reactive element). What is implicitly communicated is the environment, including the conspecific and gazelle (what we call the contextual element).23 Of course the commonality of context is not precisely the same; it has, as it were, a plasticity within its mutually defined operative locale.

9. Put another way, when you and I are talking (which lions cannot), we do not have to say to each other constantly: There is a wall, there is a chair, there is a table, there is a window. Our organisms are already communicating this (nonconceptual) commonality that we assume by our common image as our separate "consciousnesses." When my words (the input) are processed by your silent neural states and then become "conscious," you take them as coming from me (without considering it), for I am represented as your "consciousness" too. But when they are your "consciousness":

• Already they represent your (organismic) output to me, for already they have associated the reaction you (organism) have to them (how you hear my tone, and your gathering response, for example), which you may communicate.

23 Here we are treating of the "directly successful" proper function (Millikan, 1984) of "consciousness." Output continues whether there is a conspecific around or not.

• The contextual commonality is not outside of us, in which we dwell; it is what we both project from ourselves as the shared context of our intercommunication.24

Context and reaction as output are the necessary actualities of the biology of "consciousness." This functions for nonverbal lions too.

10. Moreover, it is evident that when I do point to the chair, my "conscious" image of the chair that I am pointing to is part of my gesture. For the brain, the extended finger and the image to which my organism is directed are a seamlessly linked act.25 And you take this to be so, partly explicitly by the extended finger, my gaze, my words; and also implicitly, for you too have the image to which I am pointing, which I assume you have as your reciprocation.

11. In this account, the reason for the integral nature of "consciousness" is evident. For humans, and presumably some other animals, output must be comprehensive enough to be an apposite analogical representation of "conspecific behavior in the environment" (see also footnote 23).

12. The brain does not consider the "conscious" image it has created and then, from that image, or using that image as a guide, decide to point, as representationalism supposes. Why would it need to? It is this interpretation that manufactures the superfluous character of consciousness. The image has no property for the brain that exceeds its communicative role (i.e., no intentionality about the world).

13. In the organismic approach, my "consciousness" is actually my organism's analogical output to you. Although my sense about my "consciousness" is that it is functioning as input to me or about me, in fact it is my brain's projection of its status about its relation to the world, including you. My (misinterpreted) sense of what it is, is part of my organism's output.
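As an aid to intuition only, the following Python sketch schematizes steps 1 through 13 for the two hunting lions. It is a toy under stated assumptions (the event dictionary, the ConsciousState record, and the silent_process function are invented for illustration), not a proposal about neural implementation: the point is simply that B's "conscious" state is modeled as output, an analogical recreation of A's behavior-in-context bound to B's own reaction.

```python
from dataclasses import dataclass

# Schematic toy of the A/B interlock (every name here is hypothetical).
# The "silent" processing is an opaque function of the input; the
# "conscious" state is the organism's analogical output: a recreation of
# the conspecific's behavior-in-context plus the organism's own reaction.

@dataclass
class ConsciousState:
    contextual: dict   # implicitly shared context (environment, conspecific, prey)
    reactive: str      # explicitly communicated reaction (movement, gesture)

def silent_process(input_event: dict) -> str:
    """Opaque, 'silent' selection of a reaction; no image or thought involved."""
    # If the partner flanks right, move left to close the pincer on the gazelle.
    return "move_left" if input_event["partner_move"] == "move_right" else "hold"

def respond(input_event: dict) -> ConsciousState:
    reaction = silent_process(input_event)
    # The "conscious" state is output: an analogical re-presentation of the
    # partner's behavior in the shared context, bound to the organism's reaction.
    return ConsciousState(
        contextual={"partner": input_event["partner_move"],
                    "prey": input_event["prey"],
                    "terrain": input_event["terrain"]},
        reactive=reaction,
    )

# A moves right with the gazelle in view; B's "consciousness" of the scene
# is simultaneously its reciprocal output to A.
a_event = {"partner_move": "move_right", "prey": "gazelle", "terrain": "grassland"}
b_state = respond(a_event)
print(b_state.reactive)     # "move_left": the behavioral output to A
print(b_state.contextual)   # the implicitly shared contextual element
```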

The Required Shift of Paradigm

What is called "consciousness" is physical analogical signification.26 Its communicative role is for conspecifics, but cross-species communication also can take place thereby.

24 Imagination is the brain capacity for the equivalent function without physical copresence.

25 Indeed, considering Libet's findings, the (inaccessible) brain-decision to point subsequently incorporates the image to which the pointing is directed.

26 A comparison can be drawn with the chameleon's skin, but space precludes going further into this.


The assumption from representationalism is that communication just happens somehow (by brain or mind magic). This is obscure, and still leaves consciousness mysterious. But, indeed, even if B's brain interpreted what is going on with A, and A's brain what is going on with B, nonconsciously, both brains still must be performing that interpretative task. The claim here is that the interpretative task, in terms of output, is the brain activity we refer to as "consciousness." For the function of "consciousness" is supra-organismic.

But then what we point to as consciousness is not, in terms of biology, consciousness. The Chambers dictionary has conscious as: "having the feeling or knowledge of something; aware." But biologically, "consciousness" is not the original location or manner of organismic knowledge: It is the articulation, the re-representation of the knowledge, causally held in the silent neural states, that can be communicated.

Indeed, whatever meaning is deemed to be, it results from re-representing neural states for communication to conspecifics. Were we solipsistic organisms, meaning, in the terms we understand it, need never arise.

A highly misleading philosophers' phrase is that phenomena are how objects appear to the mind. The position here is the contrary. Phenomena are how the brain projects its status to the world. Without the mind, which communicates by transmitting over the (for Descartes) infinite gap between individuals (generating the "Are there other minds?" problem), one sees that what is transmitted manifests itself by the same (or similar) "conscious" occurrence in another individual, precisely because they have the same (or similar) neural states.27

If I say "I see a tree" you (organism) do not hear me say "I see a tree" but undergo the neural states generating your "consciousness" as my words "I see a tree." I do not have access to the manufacture of my words, nor you of your hearing. But their causal existence within each of our organisms is compatible neural states. The spoken and heard words do not assure our (infinitely separate) minds of the communication between us. The neural states those words portray are the causal actuality our organisms then generate (as "consciousness") as the fact of communication. This is the communicative interlock in operation.28

27 This is not a theory of mind. Not as, for example, is Nicholas Humphrey's (1983) social but dualist view of consciousness as a means of understanding other animals by one's own experience.

28 There is no passive "intake" in consciousness, as the philosophical expression "consciousness of" misleadingly implies (cf. Munz, 1993, p. 68).


Now a riposte to this might be: But what if I'm alone? I'm not communicating with anyone, so on this principle my "consciousness" is redundant. But this is not so. The biological proper function of "consciousness," in Millikan's sense, is to be the portrayal of neural status, and this will happen regardless of whether anyone else is a participant in the communicative process (Millikan, 1984). The riposte, again, presupposes our experience is going on "for us," as mind, as soul.

Suppose one is thinking and has, alone, a new idea or realization. The organism, at some later time, may well reuse this idea or realization. But it will not remember and use that "conscious" state. Silent neural states will repeat and the idea will reoccur (including possibly the bound "fact of past occurrence") and may be useful in some communicative context. Even if one is only "using the idea" again in some isolated situation, that the idea occurs does not guide what the organism does, but is simply available to be the communication of whatever the organism is operating as, those silent neural states. To say (as some do) that a verbally expressed thought feeds back per se into neural states has not grasped the mechanisms, for it preserves the notion that mentality is in the brain. Indeed, as "conscious" appearance, no mental state is ever exactly the same, for the brain's states will never be the same.

A riposte to the banishment of "mind," and all its terminology, which is so familiar to us, might be that if one is describing what neural states are doing, and the construct "mind" is such a description, why is it not adequate?29 There are three responses. (1) Mind concepts were invented before more insightful organismic understanding occurred. De facto, starting from here we would not presuppose the mind, and therefore owe it no historical allegiance. (2) The brain does things that have rendered the mind story in the first place, and we need to understand what they are explicitly, which the current mind story obscures. (3) We simply will not grasp an organismic understanding, which is biologically of great significance, if we perpetuate a fiction in the midst of it.

Most disturbing is the understanding that writers have reached, from Freud onwards, that our existence is planned and controlled without any conscious awareness until whatever the brain "decides" to make conscious. But we now conclude that, even when "conscious" awareness arrives, it is not awareness in the commonly understood sense.

29 Many writers might take this view, including such diverse figures as the philosopher Donald Davidson and the cognitive scientist Bernard Baars.


For that presumes some fundamental (ontologically primary) link between this very experience as our self and(/to) the world, time, events, relationships. But the primary link is in the silent neural states.

So it is that the organismic approach undermines the soul notion fundamentally. For it concludes that what is going on in the organism does not correlate with the "for me" or "as me" sense. What is causal is unknowable, and what is knowable functions for the recipient who is not me, or more exactly, for the organismic community of which I am a member. (It is, perhaps, Dennett's attempt to save the soul-integrity of consciousness that undermines his account.)

The Research Program of the Organismic Approach

Since the approach here is novel, any research program requires a fresh start in both understanding and describing what is going on in the organism, and developing an adequate descriptive vocabulary. There may be interest in associating aspects of mentalist terminology with the new, but finding the new is the prime aim. For space reasons, it is not possible here to lay out formally the structure and content of such a program. The following is an indication.

The Brain

As the organ responsible for "what we do," "what we think," "how we feel," a number of fundamental issues about the brain must be pursued. Some of these lie in empirical science, as in: How does the brain turn physiology into "conscious" experience? Some lie in discovering what the brain does in its control activity, which will be both empirical and according to testable models. Models exist: the competitive demons idea, as in Dennett's view of consciousness as the "winning control of the organism"; Freud's metapsychological models; Baars's Workspace Theory of Consciousness (1996); and the folk psychological principles of means/end, belief/desire reasoning.

The problem is, to varying degrees, these project onto the neurophysiology loaded mentalist assumptions. The task, now, is to be accurate to the physiology. The rise of the neural network model, which in an as yet crude way perhaps replicates elements of brain activity, demonstrates that the physical can achieve the apparently mental without being primed somehow with mental functionality.


Physical network states can identify objects and perform logical operations without working on any kind of problematic symbolic content of those entities or processes. This is all well known.
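For readers who want the familiar textbook illustration, the sketch below shows a tiny feedforward network realizing the logical operation XOR. The weights are assumptions, chosen by hand for the example rather than learned; the point is only that nothing in them is a symbol for truth, falsity, or XOR, yet the numerical states carry the operation.

```python
import numpy as np

# Minimal sketch (hypothetical hand-set weights, not a biological claim):
# a tiny feedforward network computing XOR. The weights are just numbers;
# none of them is a symbol for "true", "false", or "XOR", yet the physical
# state of the network realizes the logical operation.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hidden layer: unit 1 approximates OR, unit 2 approximates NAND.
W_hidden = np.array([[20.0, 20.0],
                     [-20.0, -20.0]])
b_hidden = np.array([-10.0, 30.0])

# Output unit approximates AND of the two hidden units: AND(OR, NAND) = XOR.
W_out = np.array([20.0, 20.0])
b_out = -30.0

def xor_net(x):
    h = sigmoid(W_hidden @ x + b_hidden)
    return sigmoid(W_out @ h + b_out)

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, round(float(xor_net(np.array(x, dtype=float))), 3))
# Prints approximately: [0, 0] 0.0, [0, 1] 1.0, [1, 0] 1.0, [1, 1] 0.0
```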

I can only raise two key issues.

1. The death of phosphorescence. The first is a mistake about what consciousness does, which, despite material here being widely available, has not been discarded. This is the supposed difference of processing between the conscious and nonconscious; that when we are conscious "we" are active in a way that is not the case when nonconscious. For example: "PET studies, published in the 1990s, were stunning for the sheer dynamism of the change.... When a task was novel, wide areas of the brain were lighting up.... But with just a few minutes' practice, the same results could be produced with hardly any visible effort at all. A skill that had been learned or explored in the full limelight of consciousness had become downloaded to create a low-level action template" (McCrone, 1999, p. 192).

This description by McCrone of Richard Haier's experiment on students undergoing a computer game test (typically) confuses "conscious" apparentness with causal brain activity. It is reasonable to assume that the brain must modify itself for the task, and to begin with processing is wide ranging and complex. Once modified, it can be replayed without the modification process. But this does not mean that "consciousness" pours its "full limelight"30 on the learning process, whatever that may mean. "Consciousness" is the (selected, regulated) portrayal of what the brain is doing; the brain does by neural processes which are not, nor can be, conscious. While McCrone's glamorizing misdescription is so inscribed in our vocabulary, a grasp both of "consciousness" (the brain's communication mechanism) and of the neural causation will remain elusive.

2. Biological translation categories. But clearly, what causes the organism to act, although describable in neural terms, must also be describable in terms of biological categories. For neurophysiology implements biological aims. Freud understood that human activity realized biological aims, which themselves were unavailable to consciousness and generalized, and became specific in the process of realization. This is how one object could substitute for another in satisfying some primary biological goal. With a different intent, but resonant insight, Lakoff (1987) and Johnson (1987) have found within mental activity and its verbal denotation primary categories of body-world interaction.

30 Ryle's word is phosphorescent (1949).


Finding the translation categories of brain physiology into the biological aims, portrayed as "consciousness" and carried out in the actions of the organism, will be a major task.

Now, one might suppose that belief-desire mentalist discourse is such a translation vocabulary. But belief (for example) is neither a neural nor a biological category. One might say that belief represents a neural state upon the basis of which the organism acts. Similarly, desire represents a biological category upon which the organism attempts to alter its state. But the verbally expressed categorial self-understanding of beliefs and desires (unless we are going to work downwards from apes to thermostats on the principle that beliefs and desires are just [bio-]mechanical terms, as some propose) does not illuminate either what the neurophysiology does or the biological aims of the organism. These are simply not available to belief-desire vocabulary. Common human experience expresses the sense that, despite being able to say what one is doing, one does not understand it. "I love him, but I don't know why." "I bought that table; I think it's perfect, for some reason." The mystery in human experience is that what we know and express does not fully explain what is going on with us. But this is not a mystery of the human soul; it is the divide between what "consciousness" delivers, interpreted in mentalist vocabulary, and us as organism.

Phenomenology

There are various aspects to reinterpreting phenomenology upon biological and neural understanding; some of this has happened. The brain's operation may account for aspects of our experience: Brain states occur for a limited time, as does attention; immediate memory can hold up to only seven items on average, which presumably is a physiological constraint; emotional states coincide with the presence of certain hormones and peptides. These physical-experiential coincidences (with the brain deficit and imaging analyses) indicate that "conscious experiencing" is physically locatable.

More important for the approach here, however, is to analyze "consciousness" on the understanding that what we are dealing with is a biological interpretation of physical states. "Consciousness" portrays the operation of neurophysiological currency, upon which already developed structures and states are influential. It does this in terms of how it finds the world (or its own states as memories, concerns, etc.), and how it is reacting to that finding.


It is a twin-poled expressiveness, as Damasio, and Freud before him, understood (cf. Freud's two [inner, outer] representational surfaces, as discussed by Mark Solms [1997]).

It is remarkable that evolution should have taken this path. The hungry infant screams with a "conscious" state of presumably almost complete indeterminacy. The adult has a fine communicable sense of his or her pangs of hunger together with discriminatory premonitions of cuisine, and an inventory of appropriate restaurants. "Consciousness" is, as it were, a biological fabric of interpreted neurophysiology that during life is woven into communicative possibilities, particularly with the acquisition of language. It works because brains are evolutionarily constructed so. In this sense we are not separate individuals. Although we may explore the functional details in lower animals without language (where, for example, it begins in evolution), we appreciate that language (with little communicative content) can function because of its effect on the massive neural structure it activates.31

This mostly goes unnoticed, being mostly not part of "consciousness" in the communicators.

Many characteristics of the phenomenology, looked at this way, have simply passed by investigation. As in the previous section, only two key issues can be raised.

1. The biological "sense of." "Consciousness" brings with it its own biological function, the "sense of." Claxton dwells on the sense of free will. But in fact this "sense of" underlies the whole fabric of "conscious" experience, indicating not some actuality of the person's grasp of the world (a mentalist belief category), but merely a preparedness of the organism to continue to the next moment on this indicator (a biological status or action category).

In the case of blindsight sufferers it is said, for example, that lack of conscious experience limits their actions. Under experiment, though they are capable of guessing what object is in their blind field from a pregiven list with above average success, which implies their brains have access to some visual information, they would never act voluntarily in relation to that object because they assert they cannot see it. This seems to confirm that it is the visual experience itself that allows a normal individual to act voluntarily (Marcel, 1988).

31 Euan MacPhail (1998), for example, is doubtful that our conscious fabric can be anything like that of other animals despite our sentiments, for only language, he thinks, enables consciousness of self, which he deems prerequisite. But his view of consciousness is not as communication.


But the conclusion is not so established.

Blindsight sufferers have brain damage that prevents their organism from registering objects in a way that, interpreted from the neurophysiology, will appear "consciously." But this does not mean that it is the "conscious" experience that enables voluntary action. It is what, and how, the brain registers what then appears as "consciousness" that enables voluntary action. With "conscious" experience of seeing comes the sense that what is seen is there. But this sense belongs to what "consciousness" delivers, not what the organism grasps of the object that makes it prepared to act. And the sense arises so that, in the communication mechanism that "consciousness" is, the expressed assurance of the object being there is made. If the object is uncertain, through fogginess or obscurity, the organism aims to grasp the object and cannot, and what is apparent, being ill-defined, may refuse to be one thing or another. What is expressed as "consciousness" is the obscure and involuntary coming and going of the nonobject, until suddenly the object appears with a sense of certainty. "Ah yes, it's an X. Definitely an X." For the X is "seen." And one would act on it. But this sense of certainty can be shattered at the next moment because the brain has gone on working, quite unbeknownst to what is registered as "consciousness," until there appears before one not an X but a Y, about which there is again certainty.

There is a clear distinction between what is there and the biological function "what one is certain of." What normals confirm over blindsighters is thus not that experiencing establishes a certainty about the world (the power of consciousness), but that a neurophysiological trigger that will enable the behavior of the organism is portrayed. Which is, of course, exactly what one would expect if not supposing "consciousness" has a transcendental reality-acquiring capacity (cf. Millikan's attack on meaning rationalism [1984]).

The illusion is, therefore, contra Claxton, not our experience, but that our experience is itself causal. The biocomputer is not creating an illusion by the sense of certainty of a perception: That would only be the case if "consciousness" aspired to the causality of the silent neural states. It does not. The sense of certainty is (in this case) to convey that the organism would act on what is apparent in the perception. Belief misses this distinction of the biology, as indicated.
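The registration/portrayal distinction argued here can be caricatured in a few lines of Python. This is a schematic toy, not a clinical model of blindsight, and every name in it (Registration, voluntary_action, forced_choice_guess) is hypothetical: the silent registration supports a forced-choice guess, while voluntary action is keyed to the portrayal channel that blindsight lacks.

```python
from dataclasses import dataclass
from typing import Optional

# Schematic toy (hypothetical names throughout): the brain's registration of
# an object is one thing; the "conscious" portrayal of that registration,
# carrying the expressed assurance "it is there", is another.

@dataclass
class Registration:
    label: str          # what the silent processing registered
    portrayed: bool     # whether a "conscious" portrayal is produced

def register(stimulus: str, portrayal_pathway_intact: bool) -> Registration:
    # Silent registration succeeds either way; only the portrayal differs.
    return Registration(label=stimulus, portrayed=portrayal_pathway_intact)

def voluntary_action(r: Registration) -> Optional[str]:
    # Without the portrayal (the expressed assurance the object is there),
    # the organism does not volunteer an action toward it.
    return f"reach_for_{r.label}" if r.portrayed else None

def forced_choice_guess(r: Registration, options: list[str]) -> str:
    # The silent registration remains available to constrain a guess.
    return r.label if r.label in options else options[0]

sighted = register("cup", portrayal_pathway_intact=True)
blindsight = register("cup", portrayal_pathway_intact=False)

print(voluntary_action(sighted))      # "reach_for_cup"
print(voluntary_action(blindsight))   # None: "I cannot see it"
print(forced_choice_guess(blindsight, ["cup", "key"]))  # "cup", above chance
```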

2. The death of mentalist reflexivity. Another factor in mentalist phenomenology that misleads is reflexivity.


For the way the brain presents its interpretation gives the sense that, out of our conscious states, we can and do reflect on, reconsider, or probe the contents of our own minds. The philosophers' introspection. We do not.

If I have forgotten a name and probe the depths of my memory, I experience: the state of forgotten, the state of realizing that I will have to recall because I have forgotten, the state of attempted recall, and a vaguer state of trying to "let my mind go blank" so that the name will emerge. These states are sustained, rather than being in an exclusive sequence. More accurately, the states not immediate to the moment will seem to be on a kind of periphery (as vision has a foveated area, with the periphery blurred or unfocused). And this tells us something about how, in the brain, the assembly of neural states, bound as "consciousness," presents itself: that there is a time/task-localized-concurrency function (called, to different purpose in the literature, working memory) which serves to copresent different material that contextualizes extended operation. In thinking of a name, I do not forget that it is a name I am trying to think of. But this does not mean that each of these, as experienced, causally interacts with the others. They are just the portrayal of the brain's operation, which here is a whole made up of overlapping segments.
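A rough sketch of the copresentation idea follows, under the assumption (ours, for illustration only) that the time/task-localized-concurrency function can be caricatured as a bounded buffer: the focal segment is portrayed together with the sustained peripheral context, without any of the segments acting causally on one another.

```python
from collections import deque
from dataclasses import dataclass, field

# Minimal sketch (assumed structure, not a neuroscientific model) of
# copresentation: a bounded buffer whose contents are portrayed together,
# the newest segment focally and the rest as sustained periphery.

@dataclass
class Portrayal:
    focal: str                                      # segment immediate to the moment
    periphery: list = field(default_factory=list)   # sustained, "blurred" context

class WorkingBuffer:
    def __init__(self, capacity: int = 7):      # "about seven items"
        self.segments = deque(maxlen=capacity)  # oldest items drop out silently

    def engage(self, segment: str) -> Portrayal:
        self.segments.append(segment)
        # Copresentation: the newest segment is focal; the rest persist as
        # peripheral context, which is what keeps "it is a name I am after"
        # in play during the attempted recall.
        *context, focal = self.segments
        return Portrayal(focal=focal, periphery=list(context))

wm = WorkingBuffer()
for step in ["name forgotten", "must recall it", "attempted recall", "let mind go blank"]:
    portrayal = wm.engage(step)
    print(portrayal.focal, "|", portrayal.periphery)
```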

A zombie presumably would not need all this elaborate presentation for, since "consciousness" cannot be its means of communication, it (presumably) does not need communicative contextualization of an extended operation. But a human presents the context of its state as well as an individual segment because only thus is it explanatory. Indeed, this very biological function may become a disadvantage, for "letting the mind go blank" is the attempt, by the brain, to prevent the context presentation getting in the way of the recall task itself.

Copresentation of the segments gives the strong impression, ensconced in our ideas and language, that it is our experience that is causal in the interrogation of our experience or minds. It seems to indicate, for example, that realizing I have forgotten the name, I "look inward" for it. But obviously the brain does not "look inward." It tries to engage the right location in its search method, which it presents as "looking inward." Perhaps more than any other one topic, analyzing reflexivity will disabuse us of our mentalist interpretation of the nature of "consciousness."

Summary

The aim here has been to identify and outline an understanding of the human organism that can be turned from a set of, as yet, unstructured insights into an explicit research program.


Much that is relevant has not even entered the discussion.

Humankind is social, but not because individuals wish to communicate the lonely depths of the soul, or even the time of day. Humans, like other organisms, are just physical objects. It is because in human biology the communicative function, "consciousness," necessarily causes the organism to share a communicable commonality of experiencing. The appearance of our perceiving, thinking, and feeling is a biological function to share organismic status reciprocally with our fellows, as an evolutionary feature which enhances our survival and reproductive efficiency, particularly in the achievement of common action. Otherwise, as we sense, it is not merely pointless but a burden.

References

Baars, B. J. (1996), In the Theatre of Consciousness: The Workspace of the Mind. New York: Oxford University Press.
Beaumont, J. G. (1999), Neuropsychology. In: The Blackwell Dictionary of Neuropsychology, ed. J. G. Beaumont, P. M. Kenealy, & M. J. C. Rogers. Oxford: Blackwell.
Brentano, F. (1874), Psychology from an Empirical Standpoint, tr. A. Rancurello, D. Terrell, & L. McAllister. London: Routledge.
Chalmers, D. J. (1996), The Conscious Mind. Oxford: Oxford University Press.
Churchland, P. M. (1981), Eliminative materialism and the propositional attitudes. In: The Nature of Mind, ed. D. M. Rosenthal. Oxford: Oxford University Press, 1991, pp. 601-612.
Clark, A. (1997), Being There: Putting Brain, Body and World Together Again. Cambridge, MA: MIT Press.
Claxton, G. (1999), Whodunnit? Unpicking the "seems" of free will. In: The Volitional Brain, ed. B. Libet, A. Freeman, & K. Sutherland. Exeter, U.K.: Academic, pp. 99-114.
Damasio, A. (1994), Descartes' Error. New York: Putnam.
--- (1999), The Feeling of What Happens. London: Heinemann.
Dennett, D. C. (1987), The Intentional Stance. Cambridge, MA: MIT Press.
--- (1991), Consciousness Explained. New York: Little Brown.
--- (1994), Dennett. In: A Companion to the Philosophy of Mind, ed. S. Guttenplan. Oxford: Blackwell, pp. 236-244.
--- (1996), Kinds of Minds. London: Weidenfeld & Nicolson.
Descartes, R. (1985), The Philosophical Writings of Descartes, Vols. 1 & 2, tr. J. Cottingham, R. Stoothoff, & D. Murdoch. Cambridge, U.K.: Cambridge University Press.
Fichte, J. G. (1994), Introductions to the Wissenschaftslehre (1797-1800), tr. D. Breazle. Indianapolis: Hackett.
Freud, S. (1895), Project for a Scientific Psychology. Standard Edition, 1:281-391. London: Hogarth Press, 1966.
Halligan, P., & Oakley, D. (2000), Greatest myth of all. New Scientist, November 18, p. 34.
Hume, D. (1739-1740), A Treatise of Human Understanding. London: Fontana.
Humphrey, N. (1983), Consciousness Regained. Oxford: Oxford University Press.
--- (1992), A History of the Mind. London: Chatto & Windus.
Johnson, M. (1987), The Body in the Mind. Chicago: University of Chicago Press.
Kant, I. (1781), Critique of Pure Reason, tr. N. Kemp Smith. London: Macmillan, 1929.
Lakoff, G. (1987), Women, Fire and Dangerous Things. Chicago: University of Chicago Press.
Laplanche, J., & Pontalis, J.-B. (1973), The Language of Psychoanalysis. London: Hogarth Press.
LeDoux, J. (1998), The Emotional Brain. New York: Simon & Schuster.
Libet, B., Curtis, A. G., Wright, E. W., & Pearl, D. K. (1983), Time of conscious intention to act in relation to onset of cerebral activity (readiness potential). The unconscious initiation of a freely voluntary act. Brain, 106:623-642.
MacPhail, E. M. (1998), The Evolution of Consciousness. Oxford: Oxford University Press.
Marcel, A. J. (1988), Phenomenal experience and functionalism. In: Consciousness and Contemporary Science, ed. A. J. Marcel & E. Bisiach. Oxford: Clarendon Press, pp. 121-158.
McCrone, J. (1999), Going Inside. London: Faber & Faber.
Millikan, R. G. (1984), Language, Thought and Other Biological Categories. Cambridge, MA: MIT Press.
Munz, P. (1993), Philosophical Darwinism. London: Routledge.
Nisbett, R. E., & Ross, L. (1980), Human Inferences: Strategic Shortcomings of Social Judgment. Englewood Cliffs, NJ: Prentice Hall.
Norretranders, T. (1998), The User Illusion, tr. J. Sydenham. New York: Penguin-Putnam.
Ryle, G. (1949), The Concept of Mind. London: Hutchinson.
Seager, W. (1999), Theories of Consciousness. New York: Routledge.
Searle, J. (1992), The Rediscovery of the Mind. Cambridge, MA: MIT Press.
Schopenhauer, A. (1818), The World as Will and Representation, Vols. 1 & 2. New York: Dover, 1966.
Solms, M. (1997), What is consciousness? J. Amer. Psychoanal. Assn., 45:681-703.
Sulloway, F. (1983), Freud: Biologist of the Mind. Cambridge, MA: Harvard University Press.


Varela, F. J., Thompson, E., & Rosch, E. (1991), The Embodied Mind. Cambridge, MA: MIT Press.

Wall, P. (1999), Pain: The Science of Suffering. London: Weidenfeld & Nicolson.

Wittgenstein, L. (1976), Philosophical Investigations, tr. G. E. M. Anscombe. Oxford: Basil Blackwell.

Philip Clapson
P.O. Box 38225
London NW3 5XT
United Kingdom
e-mail: [email protected]
