
Rethinking Systems Thinking



THE UNIVERSITY OF ADELAIDE

School of Mathematical Sciences

David Matthews B.Sc. (Maths & Comp. Sc.) (Hons)

Thesis submitted for the Degree of Doctor of Philosophy

December 2004

"Where, after the metanarratives, can legitimacy reside?"

- Jean-François Lyotard

I hereby declare that this thesis contains no material which has been accepted for the award of any Degree or Diploma in any University or any other Tertiary Institution and that, to the best of my knowledge and belief, this thesis contains no material previously published or written by another person, except where due reference is made.

I also give consent for a copy of this thesis, when deposited in the University Library, to be made available for photocopying and loan.

David Matthews
1 December 2004

Acknowledgements

"The intense view of these manifold imperfections in human reason has so wrought upon me and heated my brain, that I am ready to reject all belief and reasoning and can look on no opinion as more probable or likely than any other ...

I dine, I play a game of backgammon, I converse and am merry with my friends and when after three or four hours' amusement I would return to these speculations, they appear so cold and strained and ridiculous that I cannot find it in my heart to enter into them any further."

- David Hume

All too many students present a list of names in their acknowledgements thanking everyone from their kindergarten teachers to their pet cats. Like an actor who has just received the Palme d'Or, students are tempted to see their doctoral theses as their Opus Magnum, the pinnacle of their life's achievement. As such, they feel compelled to thank anyone and everyone who has had a role in their life up to that point.

I suppose I could also thank my pet cats. However, at some point during the course of this intellectual journey, I ceased to be driven by the idea of gaining a doctorate and began to be drawn by the profound and yet ultimately unanswerable questions posed by the terrain I was exploring. Accordingly, I no longer see this work as my coming of age; the vehicle through which I have passed from student to expert. Rather, this work has awoken in me a yearning for exploring hitherto uncharted intellectual territories that I suspect will never be completely satisfied. It has created the opposite of an expert - a forever-student. As such, I am very much aware that the journey is just beginning and will spare the reader the usual list of major life influences.

Notwithstanding the above, there have been a few people along the way who have been more than just influences but absolute necessities, without whom this work could not have been completed.

To begin with, I would not have had the opportunity to start this journey had it not been for the unequivocal support of the various people who were in some way responsible for me at my

place of employment, DSTO, during this time. Thank you, Phil, Terry and Jennie, for giving me perhaps the most valuable gift of all - time. I can only hope that when I am in your position I will be able to exhibit the same farsightedness and generosity that you have shown me.

After securing support from DSTO, I then had to find someone within academia willing to take on a mathematics graduate who fancied himself as a bit of a polymath. It would be understating the point to say that the project I had in mind was counter-cultural and risky. However, whilst already burdened with an excessive student load, my erstwhile mentor and friend, Professor Charles Pearce, accepted the challenge and, by so doing, allowed me to continue my association with the School of Mathematical Sciences at the University of Adelaide. Thank you, Charles, for making an exception for me; it would never have been the same without you.

Charles' eclectic interests, broad knowledge base and immense popularity have always conspired against him having a light supervisory workload. As such, it was necessary for me to find a co-supervisor for this work. I will be forever grateful for having Associate Professor Martin Burke of the Systems Engineering and Evaluation Centre at the University of South Australia as my co-supervisor. Martin went far beyond the expectations of a co-supervisor and in many ways this thesis would not exist in its present form without him. Of the many things I learned from Martin, perhaps the most important was to take joy in the intellectual journey. For Martin it was never solely about the final product, but the intellectual passions that the journey awoke. Throughout the journey Martin has been an unfailing source of encouragement for me. At the same time, he has been wise enough to encourage me to keep everything else in balance. The quote from David Hume (above) describes what I have come to think of as the psychosis of the intellectual. Martin's support during this time has helped lower this to at least the level of a neurosis and perhaps even squeezed out a few moments of health. Martin, I feel both enriched and undeserving to have had you as a supervisor. Thank you.

On the same note, the few moments of health and balance that I have been able to enjoy during this time would not have been possible without my family and close friends. I do not have the space to list them all, but they know who they are. One that deserves a special mention, however, is my best friend and my wife, Elizabeth. There is a truism I have heard which states: "the only thing more frustrating than writing up a doctorate is living with someone writing up a doctorate". Elizabeth has handled this latest fancy of mine with all of the wisdom, grace and poise with which she has handled all of the others, whilst continuing to excel in her own career. Thank you so much for your ongoing support, Liz.

A final note: writers are parasites, always leeching ideas from other people. I am particularly grateful to have had the opportunity to leech (re-view and re-contextualise) the ideas of some

truly extraordinary thinkers. Many of them have already helped change society in profound ways and some of them may yet do so in the future. It is impossible to acknowledge my debt to all of these thinkers, so I will not even try. However, I would like to dedicate this work to one of them, Charles West Churchman, who, sadly, passed away during the preparation of this thesis. No thinker has had a greater influence on the systems movement over the course of their intellectual life than Churchman. Moreover, it is hoped that his influence will continue to be felt well into the twenty-first century, helping to shape both the epistemic basis and ethical motivation of generations of systems theorists to come.

"Most students think that writing means writing down ideas, insights, visions. They feel that they must first have something to say before they can put it down on paper. For them writing is little more than recording a pre-existing thought.

But with this approach 'true writing' is impossible. Writing is a process in which we discover what lives in us. The writing itself reveals what is alive. ... The deepest satisfaction of writing is precisely that it opens up new spaces within us of which we were not aware before we started to write. To write is to embark on a journey whose final destination we do not know."

- Henri Nouwen

Overview

"We find ourselves at the end of one epoch, and on the threshold of entering a new one whose contours, as far as I can see, are not yet fully visible."

- Gerald Midgley

The above appears in Midgley's (2000) book Systemic Intervention. One of the aims of this book is to "undertake a fundamental rethink of systems philosophy". In regard to this aim, Midgley acknowledges that: "it would be arrogant of me (not to say foolish) to think that I could achieve it in just a few chapters ... Nevertheless, I hope that I can make a reasonable start so that we can begin to shape a credible alternative to mechanism for the 21st century". Following Midgley, this thesis aims to continue the rethink. Like Midgley, I accept that a reorientation of an entire intellectual movement is not about to happen through the work of one thinker. However, it is hoped that the ideas embedded within these pages will help keep the conversation that Midgley initiated going, and by so doing, help shape "the contours" of the systems community in the 21st century.

Like the writings of the thinkers I most admire, this thesis aims to be perspicuous rather than systematic¹. Instead of attempting to construct a monolithic, all-encompassing theory on the nature of systems, it attempts to bring into a clearer light the folly of some of the core presuppositions of Occidental thought since the Enlightenment, and by way of implication, the folly of adopting these presuppositions within the systems community².

It offers a reason why many of our so-called 'scientific' attempts at solving difficult, contemporary, socio-cultural, socio-technical and socio-economic problems have been seen to fail. Such problems include: environmental problems; problems involving management, strategy and high-level decision making; problems involving the economy and its relationship to complex societal issues such as welfare, education, and health; problems involving the human mind, mental illness, and psychiatry; and problems arising when the natural sciences 'come out' (of their laboratory conditions) and attempt to come to terms with the incredibly complex wider world around them.

It suggests that the reason why we have been unsuccessful in our attempts at finding solutions to some of these contemporary problems is not because our methods need altering (as the early systems theorists would have it) or because our experts have made mistakes, lost impartiality or have yet to discover the fundamental theories for these new domains (as the positivists and their allies would have it), but because of the impossibility of our aims.

It argues that the traditional aims of objectivity and certainty that have characterised scientific and pseudo-scientific inquiry during the modern era are ultimately unachievable and that, in order to develop a more realistic self-understanding of the natural, analytical and social sciences, we should re-visit such basic concepts as knowledge, science and system.

It recommends that the traditional 'problem solving' paradigm of inquiry (which involves such binary oppositions as question/answer, truth/falsity and subject/object) be replaced with a paradigm that seeks to design 'contexts'. Under the context paradigm, it no longer makes sense to ask questions such as which answer is correct (there is no such thing as a fundamentally 'correct' context). Similarly, under the context paradigm, 'truths' within one context are always relative to that context and whenever a new context is designed the 'old' truths cease to be truths at all. Therefore, inquiry ceases to be a matter of finding a match between representation and reality, but a process of contextualisation and re-contextualisation.

¹ 'Perspicuous' here refers to what Wittgenstein called 'perspicuity': a way of confronting problems and seeking to 'dissolve' them by seeing things in a different light rather than developing some new theory that claims to 'solve' them.

² In this sense, it can be said that this thesis is continuing Flood's (1990a,b,c) project of 'liberating systems theory' from the presuppositions that have, to date, largely kept the movement within the modern episteme. It is hoped that the arguments presented herein represent a contribution to (and extension of) this project.

Under this new paradigm, a 'system' ceases to be objective and independent of the observer; rather, it becomes a 'context' in which the observer understands some aspect of the world. Accordingly, 'systems thinking' ceases to be a process of defining the system/environment and analysing its internal relationships, but a process of contextualising and re-contextualising 'parts' and 'wholes'. Representation is therefore never static; it is a process. Specifically, it is a process of designing contexts (or perspectives): preferably multiple contexts. Similarly, knowledge ceases to be a list of 'accurate representations' or 'certified truths' but a process or, perhaps more appropriately, a 'conversation' in which the knowers are developing new ways of making sense of (and ultimately influencing) their phenomena of interest.

The arguments for the above are parasitic on a number of more specific contributions from such thinkers as Kant, Hegel, Nietzsche, James, Saussure, Wittgenstein, Dewey, Kuhn, Feyerabend, Derrida, Foucault, Putnam, Rorty and many others. What follows are some minor extensions to these contributions, some minor contributions of my own, and, hopefully, a major re-contextualisation, in which a new perspective is brought to bear on some of the problems facing the applied sciences. Above all, it is hoped that this thesis unfolds a new strand of thought (embedded within a largely untapped literature) for the systems community, and by so doing, helps spark a discourse in which new understandings of systems and systems thinking may emerge.

The Structure of this Thesis

"Take the red pill, and we see how far the rabbit hole really goes."

- Morpheus, The Matrix

This thesis tells four stories.

Part One tells the epistemic story of postmodernism, with its focus on knowledge and its legitimation. It argues that the traditional narratives of legitimation of the modern era are flawed and that the only appropriate response to such an understanding is to seek an episteme free from such narratives. Moreover, it ends with the suggestion that if systems theory wants to move into line with the intellectual spirit of this postmodern age, then it would need to rid itself of its own narratives of legitimation as well.

Part Two tells the story of philosophy of science, with its focus on methodology and progress towards truth. This story charts the adventures of its great hero, science, who, having defeated its great nemesis, superstition, was now advancing towards complete knowledge of the world around us. Such a story stands in direct contradiction to the postmodern incredulity towards metanarratives and, if valid, could provide the basis of legitimation that

modernity sought. However, it concludes by despairing of any all-encompassing attempt to partition science from non-science or of ensuring that science is progressing closer towards truth. By following the dominant legitimation narrative of our time through to its dissolution, Part Two ensures that the incredulity towards narratives of legitimation that arose out of the epistemic story of postmodernism (Part One) remains intact in the face of the methodological story of philosophy of science (Part Two).

Having overcome scientistic opposition to postmodernism we return, in Part Three, once again to the project of re-thinking systems thinking with respect to postmodern patterns of thought. As such, Part Three tells the story of the systems approach, with its focus on holistic inquiry. It argues that the early systems theories implicitly adopted many of the presuppositions of the narratives of legitimation of the modern era but that, increasingly, these have been discarded. It concludes by suggesting that the systems approach discard its need to legitimate itself with respect to any metanarrative and embrace the contextual, contingent and plural nature of the systems that it studies.

Each story is a new contribution in its own right. However, and this is the principal aspiration of this work, it is hoped that a fourth story also emerges: a new and unexpected story that can only be told when themes from the other three stories are interwoven. It is this fourth story that I hope will create interest from the wider systems community and inspire other potential story-tellers to continue to re-think systems thinking in the light of postmodern epistemology and philosophy of science.

Given the above, it is not surprising that the arguments developed in this thesis do not proceed linearly. Indeed, a cursory glance at the structure itself betrays the non-linear nature of the arguments embedded within it. The reasons for proceeding in this manner are numerous.

First, and perhaps most importantly, the objects of this thesis, namely ideas, specifically ideas that I believe the systems community needs to come to terms with, do not lend themselves to a linear presentation. Whilst many of the thinkers discussed herein have presented their ideological contributions as grand independent systems of thought, nearly all of them have been subject to re-contextualisation by later thinkers. Thus, overlappings have appeared at the edges and, from these, new schools of thought have been created, deconstructed and thence re-contextualised again and again. The situation is one of constant flux whereby meanings are changed from one context to the next.

In order to aid both learning and assessment, such a profusion of change is typically presented to philosophy undergraduates as independent wholes. This could, quite possibly, colour the way in which they understand the space in which these ideologies are at work (and

play) for the rest of their academic lives. However, and this brings me to the second reason for non-linearity, I myself have not had the benefit/hindrance of such an undergraduate schooling, having only moved from a traditional mathematical background into the philosophical interests presented here over the course of this doctorate. Accordingly, the literature I have reviewed in these pages, and hopefully added to, has been synthesised without the benefit/hindrance of traditional accounts of philosophical development provided as par for the course in any undergraduate program of study. Thus, the continual flux of particles that make up the literature on science studies, epistemology, philosophy of science and systems thinking has been exposed without a dominant meta-narrative seeking to impose order on them. Of course, the literature comes complete with its own meta-narratives and these are legion. However, with the benefit of beginning this journey with no organising framework, these totalising narratives are easily seen for what they are: but one of many contextual interpretations.

The structure that this thesis follows, therefore, is obviously a contextual interpretation in itself. The context chosen here reflects the aim of this thesis: to continue the re-think of systems thinking that Midgley (2000) initiated in Systemic Intervention. However, whilst this thesis aims to follow Midgley, it hopes to do so by presenting an entirely distinct set of ideas (associated with an entirely distinct body of literature) that, for one reason or another, was not presented in Systemic Intervention. Hopefully, the work will not be ignored, but critiqued, extended or reformulated by others more capable than I, and thereby serve to broaden the discourse within the systems community and, perhaps, help form a new community of systems re-thinkers.

In the hope of aiding the above, it seems wise to guide the reader through the arguments presented and to attempt to impose on the text some semblance of linearity aimed at helping the reader gain an overview of its contributions.

PART 1: KNOWLEDGE

As we have already noted, Part One tells the epistemic story of postmodernism, with its focus on knowledge and its legitimation. It discusses the principal metaphysical and epistemic foundations of the modern understanding of knowledge and presents an alternative understanding - a postmodern episteme.

1.1 Metaphysics and Science

This Chapter introduces both the majesty and poverty of metaphysics. It argues that science is inescapably intertwined with metaphysics and presents four distinct metaphysical theories that science has been (or could be) based on. It concludes that the epistemic and methodological self-understanding of scientific inquiry is totally dependent on the dominant metaphysical ideology of the time.

1.2 The Rise and Fall of the Modern Episteme

This Chapter discusses the rise and fall of the 'modern' episteme and charts the dominant epistemic presuppositions associated with it. It argues that the decline of modernity is closely associated with the failure of many of its epistemic presuppositions.

1.3 Towards a Postmodern Episteme

This Chapter discusses the critiques of 'modernity' from various quarters over the past century and the implications of these critiques for postmodern science and systems thinking. It argues that the epistemic presuppositions associated with modernity cannot be used to legitimate modern knowledge claims. What makes the emerging worldview postmodern, however, is not the demise of the narratives of legitimation characteristic of modernity, but the rejection of the whole idea of a narrative of legitimation!

PART 2: SCIENCE

As has already been noted, Part Two tells the story of philosophy of science, with its focus on methodology and progress towards truth. It discusses three common narratives of legitimation of modern science: the narrative of unity, the narrative of method and the narrative of progress. However, it concludes that none of these narratives can withstand critical interrogation and that in finding reasons why philosophy of science could not live up to its own pretensions, it may just have cleared a path through the ruins of modernity and towards a post-scientistic episteme.

2.1 Scientism and Reductionism

This Chapter looks at the legitimacy of using reductionist ideology to construct a basis of legitimation for inquiry. It presents a new understanding of reductionism and suggests that reductionism leads to difficulties in such areas as unification, theory development and methodology. It concludes by asking the question: if reductionist ideology is undone, then do we have any basis for partitioning science from non-science?

2.2 Scientism and the Will to Methodology

This Chapter looks at the difficulties associated with partitioning science from non-science and argues against traditional distinctions by claiming that the meta-narratives of legitimation that they employ do not withstand serious criticism. In particular, it argues that science is unable to partition itself from non-science by reference to some process of inquiry that may be termed the 'scientific method'. It

concludes by arguing that traditional scientific accounts of objectivity and certainty are a myth and that all knowledge is contextual, corrigible and incomplete.

2.3 Scientism and the Quest for Certainty

This Chapter continues the critique of the narratives of legitimation of modern science by deconstructing the concept of truth. It argues that the 'truth' that modern scientists assume they are progressing towards with each successive advancement may not be all they hoped it would be. It reviews the state-of-the-art in theories of truth within contemporary philosophy and places them in a new context aimed at breaking the impasse between the outdated binary opposition of realism and anti-realism. It concludes by suggesting that legitimation cannot be secured by an appeal to truth and that what is required is a position of epistemic humility towards the results of inquiry.

PART 3: SYSTEMS

As has already been noted, Part Three tells the story of the systems approach, with its focus on holistic inquiry. It discusses the disenchantment of the early systems thinkers with the mechanistic metaphysics of modern science and presents various attempts at reform (along with the optimism that accompanied these new approaches). The review of these early 'systems theories' concludes that the initial optimism was, by and large, misplaced and that increasingly the systems community has been forced to rethink the hopes and aims of its pioneers.

3.1 Coping with Complexity

This Chapter presents four early systems theories and surfaces the modern presuppositions associated with each. Whilst most rejected mechanism (in favour of a kind of metaphysics of wholes), all retained a commitment to objectivity, monism and truth. As such, it was argued that, despite the grand claims, the early systems approaches served only to inject fresh new vigour into many of the presuppositions of the modern worldview. The systems revolution, it seemed, was modernity's last stand!

3.2 Coping with Subjectivity

This Chapter looks at the use of systems thinking for aiding managerial decision-making. It argues that the retention of the modern scientific pretension of objectivity (embodied in the emphasis on subject-object dualism) and the positivist understanding of method (embodied in the emphasis on mathematisation) has had disastrous consequences for analysts and managers alike. It concludes by reviewing some of the proposed 'solutions' to the problems facing the management sciences with a

special emphasis on the work of Charles West Churchman, Russell Ackoff and Peter Checkland.

3.3 Coping with Power

This Chapter looks at the use of systems thinking for developing participative planning methodologies. In particular, it looks at the appropriation of Habermas' 'critical theories' to problems associated with strategic planning and social systems design. It reviews the intellectual origins of this 'critical systems' movement by tracing its roots back to the emancipatory aims of the neo-Marxists of the Frankfurt School. It concludes by suggesting that a characteristically postmodern understanding of systems would seek to emphasise the neo-Kantian 'critical thinking' component of critical systems thinking whilst downplaying the neo-Marxist 'emancipatory' component.

PART 4: SUMMARY

Finally, Part Four provides an overview of the journey thus far taken and attempts to suggest the outlines of a fourth story that, it argues, also emerges. This fourth story is the story of what a postmodern understanding of the nature of systemic inquiry might look like and can only be told when themes from the other three stories are interwoven. Such an understanding would no longer attempt to legitimate itself with respect to the speculative narratives that systems theory had thus far employed. Moreover, due to its rejection of all narratives of legitimation, it would remain localised to specific problems and issues and not seek to unearth universal categories. As such, postmodern systems theorists would be freed to worry less about systems theory itself and more about deconstructing the systemic boundary judgements that have led to so much socially-countenanced oppression and injustice in the world around us.

Contents

1.1 Metaphysics and the A Priori Nature of Scientific Knowledge
1.1.1 The Metaphysics of Forms and Formistic Science
1.1.2 The Metaphysics of Mechanisms and Mechanistic Science
1.1.3 The Metaphysics of Wholes and Holistic Science
1.1.4 The Metaphysics of Contexts and Contextual Science
1.1.5 Systems Theory and The Metaphysical Presuppositions of the Age

1.2 The Rise and Fall of the Modern Episteme
1.2.1 The Origins of the Modern Episteme: The Greek Period
1.2.2 The Rise of the Modern Episteme: The Renaissance
1.2.3 The Pinnacle of the Modern Episteme: The Age of Reason
1.2.4 The Decline of the Modern Episteme: Kant and the Critique of Subject-Object Dualism
1.2.5 The Fall of the Modern Episteme: Nietzsche and the Rise of Perspectivism

1.3 Towards a Postmodern Episteme
1.3.1 Postmodernism Everywhere: The Many Faces of Postmodernity
1.3.2 Through the Looking Glass: Dissolving Modernist Epistemology
1.3.3 Attacking the Foundations: Postmodernism and European Post-Structuralism
1.3.4 For the Edification of Us All: Postmodernism and American Pragmatism
1.3.5 Towards a Postmodern View of Science and Systems

2.1 Scientism and Reductionism
2.1.1 From Reduction to Reductionism
2.1.2 Reductio ad Unum: Reductionism and the Unification of the Sciences
2.1.3 Dissectio Naturae: Reductionism and The Analytic Method
2.1.4 Reductio Methodologicae: Reductionism and the Scientific Method
2.1.5 Reductio ad Absurdum: The Illegitimacy of Reductionism and Monism

2.2 Scientism and the Will to Methodology
2.2.1 Francis Bacon and The Verificationist Account of Science
2.2.2 The Vienna Circle and The Positivist Account of Science
2.2.3 Karl Popper and The Falsificationist Account of Science
2.2.4 Thomas Kuhn and The Socio-Historical Account of Science
2.2.5 Conclusions: The Illegitimacy of the Honorific 'Science'

2.3 Scientism and the Quest for Certainty
2.3.1 Introduction: The Science Wars and The New Narrative of Legitimation
2.3.2 The Ontological Theories of Truth
2.3.3 The Anthropological Theories of Truth
2.3.4 Science and Truth: Ontological and Anthropological Positions
2.3.5 Conclusions: The Legitimacy and Illegitimacy of Truth

3.1 Coping with Complexity: Systems Thinking for Studying Emergent Phenomena
3.1.1 The Concept of Emergence and the Emergence of Systems Theory
3.1.2 and the Study of Indeterminable Emergent Properties
3.1.3 Chaos, Synergetics and the Study of Unpredictable Emergent Properties
3.1.4 Autopoiesis and the Study of Qualitatively Distinct Emergent Properties
3.1.5 The Modern Presuppositions of the Early Systems Approaches

3.2 Coping with Subjectivity: Systems Thinking for Managerial Decision Making
3.2.1 The Trials and Tribulations of Operations Research
3.2.2 Attempts at Reform: Systems Analysis and Policy Analysis
3.2.3 A Kuhnian Crisis in the Management Sciences: Hard and Soft Approaches
3.2.4 Towards a New Paradigm (1): The Enormous Impact of Charles West Churchman
3.2.5 Towards a New Paradigm (2): The Origins and Evolution of Soft Systems Methodology

3.3 Coping with Power: Systems Thinking for Participative Planning
3.3.1 The Frankfurt School, Critical Theory and Jurgen Habermas
3.3.2 Flood and Jackson's Appropriation of Habermas in Total Systems Intervention
3.3.3 Werner Ulrich's Appropriation of Habermas in Critical Systems Heuristics
3.3.4 What is Critical Systems Thinking?
3.3.5 Towards a Postmodern Understanding of the Nature of Systemic Inquiry

PART 5: REFERENCES

Part 1: Knowledge

"An illusion can never be destroyed directly, and only by indirect means can it gradually be removed ... a direct attack only strengthens a person in his illusion. There is nothing that requires such gentle handling as an illusion, if one wishes to dispel it."

- Søren Kierkegaard

Where to begin? Much of what follows in the pages of this volume will attempt to dismantle certain metaphysical and epistemic presuppositions of the modern intellectual (including scientists, analysts, engineers, economists, sociologists, historians, etc.)³. However, such an aim carries the implicit assumption that the reader already agrees that such presuppositions are present in any formulation of expert disciplinary knowledge. Whilst this idea is not new in philosophical circles, it would be unreasonable to assume that all readers of this present work will accept it without any further discussion. Certainly, many would argue that modern science, for one, has replaced the empty metaphysical speculation of the pre-modern world and provided us with knowledge that is empirical, verifiable, and for want of a better word - true. We are told from an early age that science is neutral, value-free and objective. Moreover, we live in a world where science has provided us with various ways and means to enrich our standard of living. As the old adage would have it: "the proof [of the truth of science] is in the pudding [of its success]". Given this background, perhaps a good place to start is with a discussion of how science has always and will always coexist with metaphysics.

Within the discipline known as 'philosophy of science', it is now widely understood that science cannot progress without powerful a priori assumptions about the world it is trying to investigate (Harris, 1965). For example, modern science has generally taken as a given the idea that the world around us is fully law-governed and thereby fully intelligible. That is, modern science assumes that the world 'functions' something like a machine or an algorithm. This assumption has come to be known as mechanism. Although many assume that mechanism is one of the discoveries of modern science, philosophers have maintained for a long time that it is actually one of its implicit assumptions. However, mechanism is only one of a possible multitude of a priori metaphysical assumptions and the aims, methods and results of 'scientific' inquiry may look very different if an alternative metaphysical ideology was taken as our starting point. Accordingly, the following Chapter presents some of the alternatives to mechanism and explores the consequences of beginning our intellectual endeavours with a different set of a priori metaphysical presuppositions.

³ The term 'metaphysics' does not have a precise or agreed upon definition. The first reference to the term is found in Andronicus of Rhodes' arrangement of Aristotle's writings. According to Andronicus, metaphysics referred to the writings that came after Aristotle's writings on physics. Following Andronicus, metaphysics has been thought of as the study of reality in its most basic form. It attempts to tell us what anything must be like in order for it to be at all. As Heidegger would say, metaphysics is the study of being. Often metaphysics is used synonymously with ontology (Greek 'on' meaning 'being' and English 'ology' meaning 'study of'). Both attempt to concern themselves with the rational study of what lies beyond the world of 'appearances' (which is the focus of natural science). In this sense, metaphysics may be thought of as pre- or proto-science. Epistemology, on the other hand, is the study of knowledge and how it is obtained. Derived from the Greek 'episteme' (knowledge) and English 'ology' (study of), epistemology studies reason (i.e. logic), observation (i.e. perception), belief, language and methodology. Despite textbook caricatures, however, it is often difficult to separate metaphysics and epistemology, as both presuppose each other. Obviously, knowledge requires a 'known' and, as such, epistemic questions cannot be considered in isolation from questions concerning what exists. Nor, on the other hand, can metaphysics be studied without consideration of how it is possible to come to know what exists.

1.1 Metaphysics and the A Priori Nature of Scientific Knowledge

"Man is by nature metaphysical and proud. He has gone so far as to think that the idealistic creations of his mind, which correspond to his feelings, also represent reality."

- Claude Bernard

In his book World Hypotheses, Stephen Pepper (1942) argued that the metaphysical framework within which scientific evidence is organised and structured at any given point in history is grounded in the dominant 'metaphor of organisation' prevailing at the time. The first part of his book dispensed with the naïve cognitive attitudes of extreme uncertainty (which Pepper labels utter scepticism) and extreme certainty (which Pepper labels dogmatism)4. Having discarded dogmatism and utter scepticism, Pepper argued that the middle path between these extremes is populated with different 'world hypotheses', which are, in turn, based upon different 'root metaphors'.

The root-metaphor process is described as follows:

"A man desiring to understand the world looks about for a clue to its comprehension. He pitches upon some area of commonsense fact and tries if he cannot understand other areas in terms of this one. This original area becomes then his basic analogy or root metaphor. He describes as best he can the characteristics of this area, or, if you will, discriminates its structure. A list of its structural characteristics becomes his basic concepts of explanation and description. We call them a set of categories. In terms of these categories he proceeds to study all other areas of fact..." (Pepper, 1942).

As shall be discussed later, by conceiving of scientific knowledge in such a way, Pepper implicitly accepts the Kantian (1781; 1783; 1786) position that the metaphysical foundations of natural science are implicated in the construction of a scientific picture of the world5.

Apart from the description of the root-metaphor process, Pepper's other major contribution is found in the articulation of four world hypotheses: formism; mechanism; contextualism; and organicism. Thus, Pepper not only implies that science is entwined with metaphysics but that different communities have different metaphysical bases for their sciences. The aims,

4 It can be argued that these two positions are essentially the same (an utter sceptic being a dogmatist with respect to non-belief) and that a general critique of dogmatism suffices to defeat both positions. The details of Pepper's critique are beyond the scope of this work; however, some of the illustrations Pepper uses to demonstrate scientific dogmatism will be raised in subsequent sections. These include the presumed 'truth' of scientific literature, the presumed 'truth' of statements by subject matter experts (neo-guruism) and the presumed 'truth' of scientific and empirical observation.
5 Moreover, by so doing, he blurs the traditional distinction between metaphysics and epistemology.

methods and outcomes of scientific practice differ according to the metaphysical persuasion of the discipline or group. That is, scientific knowledge is necessarily sociological.

But we get ahead of ourselves. Such conclusions will indeed be reached, but only after significant argument, drawing on multiple sources. For now, it is sufficient to describe and extend Pepper's four world hypotheses. These are by no means the only four metaphysical theories 'on the market'. Nor should they be considered as completely unrelated systems of thought. Neither should Pepper's description and/or terminology be considered as the final word on the matter. However, notwithstanding such possible objections, a description of these 'world hypotheses' is called for. Moreover, a discussion on how different sciences have (often unwittingly) employed one or other of these hypotheses (or something like them) over the course of scientific history is needed to give credence to the claim (above) that "the aims, methods and outcomes of scientific practice differ according to the metaphysical persuasion of the scientist". It is hoped that at the end of this discussion, the reader will be, if not convinced, at least open to the suggestion that science is inescapably intertwined with metaphysical ideology, and therefore well-positioned to critically examine the ideas that make up the rest of this thesis.

1.1.1 The Metaphysics of Forms and Formistic Science

"It is inappropriate to ask of an object: 'what is the natural kind to which it belongs?'"

- John Dupre

According to Pepper (1942), the root metaphor of formism is similarity. The world is full of things that seem to be alike: leaves in trees, species of birds, types of stones, etc. This observation leads to perhaps the most powerful cognitive instrument of formism, namely the concept of 'classes' (or taxa). A class is a collection of objects which share one or more characteristics. Furthermore, classes are themselves organised into 'classifications' (or taxonomies). Normally, a classification proceeds from the general to the specific. That is, from the classes with the smallest number of characteristics (and largest number of objects) to those with additional characteristics (and hence fewer objects). Such a classification is known as an inheritance hierarchy, as those classes at lower levels in the classification inherit all of the characteristics of their 'parent' class in addition to the extra characteristics that distinguish them from their parent6.
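The inheritance relation described above will be familiar from object-oriented programming, where it can be sketched directly. The following is a hypothetical illustration only; the class names and characteristic sets are chosen for this sketch and are not Pepper's:

```python
# A hypothetical sketch of a formistic inheritance hierarchy: each
# class inherits all the characteristics of its parent and adds the
# extra characteristics that distinguish it from that parent.

class Organism:
    characteristics = {"autopoietic (self-producing)"}

class Animal(Organism):
    characteristics = Organism.characteristics | {
        "multicellular organism requiring food"}

class Chordate(Animal):
    characteristics = Animal.characteristics | {
        "notochord, pharyngeal gill slits, hollow nerve chord and tail"}

# Classes lower in the hierarchy carry every characteristic of their
# ancestors, so a parent's set is always a subset of a child's:
print(Organism.characteristics <= Chordate.characteristics)  # prints True
```

Note that the characteristics accumulate monotonically down the hierarchy, which is exactly the inheritance property the formist relies upon.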

It is often supposed that classification can reveal an orderly, unique and hierarchical arrangement of the things that exist. Furthermore, it is assumed that the correct classification

6 Matthews et al (2000) present a set of concepts for military systems engineering that are based upon a formistic inheritance hierarchy.

of the things in the world is determined by the natural order of these things and not imposed on the world by the taxonomer7. Thus, formistic science typically claims a kind of absolute legitimation for the division of things into their various 'classes', or 'kinds'. According to the formist, the act of discovering the 'correct' classification tells us much (perhaps everything there is to know) about the object. Thus, classification becomes the goal as well as the method of inquiry and formistic science becomes characterised by the search for nature's 'true' categories.

Perhaps the most well known output of a scientific discipline motivated by formistic presuppositions is the taxonomical structure of organisms. Within biological systematics, an organism is classified by assigning it to a hierarchical series of taxa where 'species' are considered the basic unit of the taxonomy. Rules are used for assigning individuals to species and thence to a position in the overall structure. According to the formist, nature is ordered unambiguously, and therefore, there exists a single 'true' taxonomy, deviation from which would result in scientific error8. The taxonomic structure of organisms, therefore, is considered representative of the order inherent in the natural world. Table 1 provides an example of one thread (one inheritance hierarchy) of this structure.

(Each taxon inherits all the characteristics of its parent; only the additional characteristics are listed here.)

Class: Organism. Characteristics: autopoietic (self-producing). Instantiations: all living objects.
Kingdom: Animalia. Adds: multicellular organism requiring food. Instantiations: all animals.
Phylum: Chordata. Adds: have (at some stage) a notochord, pharyngeal gill slits, hollow nerve chord and tail. Instantiations: some 43,000 species.
Subphylum: Vertebrata. Adds: spinal cord & skull. Instantiations: some 42,000 species.
Class: Mammalia. Adds: suckling, four-limbed, skin, hair, red-blood cells, homeothermic etc. Instantiations: some 4,500 species.

7 For example, the classification of people by hair colour into blonde, brunette, red etc does not reveal anything about the intrinsic nature of people as the categories are obviously artificial. However, according to the formist, when the true categories are found they reveal knowledge of the objects categorised because these categories are not artificial but a part of the order of things.
8 John Wyndham's (1955) science fiction novel The Chrysalids, which centres on a post-nuclear world in which things that mutate from the 'true forms' are considered ungodly and destroyed, is an interesting example of the potential moral implications associated with a metaphysical commitment to formism.

Order: Primates. Adds: fingers, nails etc. Instantiations: prosimians, monkeys, apes, humans.
Family: Hominidae. Adds: flat face, colour vision, upright, bipedal etc. Instantiations: apes & humans.
Genus: Homo. Adds: large brain, speech, long childhood etc. Instantiations: humans; some extinct species have been classified in this genus (e.g. Homo habilis, Homo erectus, Homo sapiens neanderthalensis).
Species: Homo sapiens sapiens. Adds: prominent chin, high forehead, less hair. Instantiations: humans.
Individual: David Matthews. Instantiations: the human writing this thesis, David.

Table 1: An Inheritance Hierarchy for Humans

According to the formist, what characterises an object is that it possesses certain 'essential' properties that place the object within its appropriate 'natural kind' (Hull, 1965; 1976). However, it has been argued by many that nature does not provide us with a unique set of 'kinds' of objects in the world and therefore "it is inappropriate to ask of an object what is the natural kind to which it belongs" (Dupre, 1981; 1993). Indeed, the question of categorisation can only be answered in relation to the goals of the taxonomer. That is, taxonomies are not inventions of nature but inventions of humans and, as such, there must be countless legitimate ways to classify objects in the world and these, more often than not, would cross-classify one another in complex ways.

As an example of cross-classification, consider the lilies of the field (Liliaceae). The lonely lily belongs to the genus Eremocrinum, the avalanche lily to Erythronium, the adobe lily to Fritillaria, the desert lily to Hesperocallis and the white globe lily to Calochortus (Dupre, 1993). However, Calochortus is also the genus of various tulips (e.g. the star tulip and the mariposa tulip). Thus, from the point of view of current biological doctrine, the terms lily and tulip appear, at best, confusing and, at worst, meaningless. Similarly, consider the birds of the air. Hawks cross several families within the order Falconiformes. However, to extend the term 'hawk' to the entire order would surely be a debasement of the 'common-sense' classification, as most would argue that a vulture is not a hawk. Such cross-classification between 'common' and 'scientific' taxonomic structures is a regular occurrence. This has led to much confusion on the applicability of such common names as lily, hawk, moth, butterfly and fish (Dupre, 1993)9.

At this point it is tempting to deride the 'common' classification as 'pre-scientific' and suggest that such terms as 'lily', 'tulip', 'moth', 'butterfly' and 'fish' are (scientifically) meaningless. Indeed, perhaps all we are discovering in cross-classification is the confusion of 'pre-scientific' discourse. Notwithstanding the appeal of this position, care must be taken before dismissing the 'common' structure in favour of the 'scientific' one. The issue is not necessarily one of

9 An interesting example here is the well-known exclusion of whales (order Cetacea) from the 'common' category fish. As Dupre (1993) highlights, this example is by no means clear-cut. The source of Dupre's doubt is that the 'common' category 'fish' lacks a tidy taxonomic correlate within the 'scientific' structure (another example of cross-classification). For example, there does not seem to be any compelling reason for combining the three Chordate classes (Chondrichthyes, Osteichthyes and Agnatha) into a single category 'fish' and excluding other classes such as Mammalia. Perhaps a more satisfactory 'common' classification for fish (and probably more likely to be socio-historically revealing) is simply aquatic vertebrate. If so, then the well-known exclusion of whales from the category 'fish' is simply wrong!

correct and incorrect classification, but more likely one of different classification criteria, neither of which can be easily dismissed as 'incorrect'. To emphasise the point, it is worth noting that in recent years the traditional scientific 'morphological' classification has been increasingly superseded by a classification employing detailed phylogenetic information together with currently accepted evolutionary conjecture. In particular, one influential school of thought, 'cladism', insists that every taxonomic distinction should reflect an evolutionary event of lineage bifurcation and hence every taxon should include all of the evolutionary descendants (and none other) on one side of such a bifurcation (Dupre, 1993). Thus, for the cladist, the class Reptilia of the more traditional scientific taxonomies is meaningless because it does not include Aves (birds), which according to current evolutionary speculation are thought to be descendants of primitive reptiles.

The examples of cross-classification between 'common', 'morphological' and 'phylogenetic' taxonomies presented above constitute only a small selection from potentially tens of thousands of similar cases. Consequently, it seems clear that classification of the world into supposed 'natural kinds' is wholly dependent on the classification criteria employed and these, in turn, are largely chosen for anthropocentric reasons such as the aims and beliefs of the classifier10. Furthermore, it seems that rather than being a rare phenomenon, differences in the aims and beliefs of classifiers are quite common and account for different classification criteria within different disciplines. For example, within the field of systematics the most important distinguisher between plants such as Grevilleas, Banksias, Brachycombe, Eucalypts and Conifers is the mode of development of their seeds. Whereas the first four belong to the Angiosperm group (whose seeds are developed in an ovary-like 'vessel'), the Conifers belong to the Gymnosperm group (which have 'naked' seeds). However, outside the domain of systematics such a division seems to have little to no use. In particular, other fields of inquiry, such as ecology, would find the most important distinguisher to be their roles in the overall ecosystem (e.g. emergent, canopy, understory or ground-cover)11. Thus, different disciplines have come to use different classifying criteria for the same objects.
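The point that different criteria yield cross-classifying taxonomies of the same objects can be made concrete with a small sketch. The objects and attribute values below are hypothetical simplifications for illustration, not data from systematics or ecology:

```python
# A hypothetical sketch: the same objects fall into different 'kinds'
# depending on which classification criterion the taxonomer chooses.
from collections import defaultdict

plants = [
    {"name": "Grevillea", "seed": "angiosperm", "role": "understory"},
    {"name": "Banksia",   "seed": "angiosperm", "role": "understory"},
    {"name": "Eucalypt",  "seed": "angiosperm", "role": "canopy"},
    {"name": "Conifer",   "seed": "gymnosperm", "role": "canopy"},
]

def classify(objects, criterion):
    """Group objects by whichever characteristic the classifier cares about."""
    groups = defaultdict(list)
    for obj in objects:
        groups[obj[criterion]].append(obj["name"])
    return dict(groups)

# Systematics groups by seed development; ecology by ecosystem role.
# The two resulting taxonomies cross-classify one another: the Eucalypt
# shares a 'kind' with the Grevillea under one criterion and with the
# Conifer under the other.
print(classify(plants, "seed"))
print(classify(plants, "role"))
```

Neither grouping is 'incorrect'; each simply answers to a different purpose, which is the thrust of the argument above.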

10 Even the basic taxonomic unit, the species, is not as orderly structured as first thought. There are a number of different accounts extant as to what constitutes membership of a species (Dupre, 1993). Indeed, evolutionary arguments have been used to deny that species are a 'natural kind' at all, claiming that individuals within a species vary whilst the concept of a 'natural kind' is a static abstraction, thereby calling into question the ontological status of species in the first place.
11 Another example of the purpose of the classifier influencing the resultant classification is the distinction between garlic and onion. Whilst current 'scientific' taxonomy makes no distinction, it would be a severe culinary faux pas not to do so at the dinner table. By the way, the onion (or garlic) plant is a species within the family Liliaceae, which is commonly thought to correspond to the 'common' class 'lily'. Whilst few would argue that an onion is a garlic, I suspect that none would contend that both were actually lilies.

1.1.2 The Metaphysics of Mechanisms and Mechanistic Science

"Nature and nature's laws lay hid in night: God said, 'Let Newton be!' and all was light."

- Alexander Pope

Whereas formism assumes that the world consists of pre-determined categories, mechanism assumes that it is governed by pre-determined laws. Accordingly, Pepper (1942) proposes that the root metaphor of mechanism is the machine.

One of the enduring contributions of the 'systems' revolution of the mid twentieth century is the observation that mechanistic presuppositions nearly always lead to reductionist attitudes towards scientific method. For many, the only way to understand a machine is to understand its component parts. Therefore, the machine must be mentally (and sometimes physically) decomposed into its parts (which, of course, could be further decomposed ad infinitum) and then understood on the basis of the behaviour of these parts and the manner in which they interact. As shall be discussed later (Chapter 2.1), the 'mental decomposition' described above is a well-known epistemic operation known as a 'reduction'. Whilst various kinds of reductions have been around for millennia, under the mechanistic world hypothesis they attain a singular pre-eminence. According to the mechanist, the only way that the world can be successfully comprehended is through conducting appropriate reductions. If the world is a giant machine, then the behaviour of the whole is necessarily determined by the behaviour of its parts12.

In characterising mechanistic science, Pepper (1942) distinguishes between discrete mechanism and consolidated mechanism. The assumption of discrete mechanism is that many of the structural features of nature are 'externally related'. Thus space is distinct from time, the primary qualities are distinct from the field of location and each atom is distinct from every other atom. The idea being that time can be truly described without any reference to space; a point or locus without any reference to another point; the shape, texture and mass of an atom, without any reference to another atom; and a law of nature, without any reference to the atoms that obey it. Pepper highlights three primary characteristics of discrete mechanism, namely:

The Field of Location: Whatever can be located is real and is real by virtue of a location in space (usually considered as an infinite three dimensional manifold).

25 The Primary Qualities: Traditional primary qualities include size, shape, motion, solidity, mass, charge, etc.

The Laws Governing the Primary Qualities: A mechanism without any laws is an empty abstraction that has lost contact with its root metaphor. Accordingly, a complete and rigid determination is common and is one of the principal attractions of the theory.

In contrast to discrete mechanism, consolidated mechanism is more likely to consider nature as a single machine, or a single field, governed by a single law. Thus the gravitational field has 'collapsed' into the spatio-temporal field and there have been serious (though unsuccessful) attempts by mechanists to amalgamate the electromagnetic laws into this single field as well13.

Whereas formistic science is characterised by the search for nature's 'true categories', mechanistic science is characterised by the search for nature's 'true laws' (mathematically expressible as 'governing equations'). Here, Newtonian mechanics becomes the paradigmatic example of a successful science, one that has discovered the 'natural laws' governing the objects within its domain.

Because of the focus on discovering the laws of nature operative within the various scientific disciplines, mechanism (a metaphysical position) is often confused with objectivist accounts of knowledge (epistemological positions). Indeed, even Pepper (1942) confused the two by claiming that: "mechanism assumes that our knowledge of the world (and language we use to frame this knowledge) reflects reality more or less accurately". So, it seems, did Midgley (2000) when he claimed that: "mechanistic science is characterised by the use of methods for structuring reliable observations to build so-called 'objective' knowledge about the world". Both of these statements confuse a metaphysical position about how the world is with an epistemological position about how we come to know the world14. The fact that the reign of mechanism has coincided with the reign of objectivist and 'pseudo-objectivist' accounts of knowledge does not mean that it has a privileged relationship with objectivity. In fact, there is no reason to suggest that a mechanistic metaphysics could not co-exist with an account of knowledge which claims that objectivity is impossible. In Midgley's (2000) words, mechanism

12 The 'absolutisation' of the reduction operation implied by mechanism is what this thesis calls 'reductionism'. Thus, within these pages, reductionism ceases to be a method of scientific investigation and becomes an ideology of mechanistic science.
13 According to consolidated mechanists, the four fundamental forces of discrete mechanism (gravity, electromagnetism and the strong and weak nuclear forces), as well as all matter, are simply different manifestations of a single force, field or essence. As such, the 'holy grail' of consolidated mechanism is the search for a unified field theory or theory of everything (TOE) that could reconcile the seemingly incompatible forces of nature into a single set of equations (Barrow, 1991). The most well known consolidated mechanist was, perhaps, Albert Einstein, who spent most of his later career attempting to reconcile the gravitational force of 'General Relativity' with the electromagnetic force of 'Quantum Theory'. Current research focuses on the attempt to use superstring theory to provide the unified field sought by consolidated mechanists (Davies & Brown, 1998).
14 Whilst both mechanism and objectivist accounts of knowledge will be critiqued in this thesis, they cannot be dismissed with a single argument that wraps the two positions up together and dismisses them both. Accordingly, they will be dealt with separately.

is simply "the world of clockwork toys". However, upon acceptance of this world, a separate (epistemic) issue arises. Namely, how can we figure out how these "clockwork toys" work?

The epistemic question posed above leads to another common confusion: the conflation of 'determinism' with 'predictability'. A consequence of the mechanistic worldview is that once the machine is started, its behaviour is strictly determined (by the causal structure of the machine). Thus, mechanism has a privileged relationship with determinism. According to the assumptions implicit within the mechanistic hypothesis, there exists an omnipresent and wholly regular causal nexus. Everything that happens is therefore strictly determined by antecedent circumstances. Indeed, the logical consequence of this position is that everything that happens is necessitated by the way the world began, determinism being an all or nothing affair15. However, whilst mechanism implies determinism, it has no direct relationship to predictability (for the same reasons that it has no direct relationship to objectivity). Thus, the argument that chaos theory has somehow undermined mechanism is simply mistaken16.

More central to undermining mechanism is the sheer unbelievability of the global determinism that it implies. Even probabilistic accounts usually specify a strictly determinate range of future states17. This has led to speculation that causality is a strictly human affair (or should I say Humean affair). The first to suggest such a conclusion was David Hume (1741) in his Treatise of Human Nature. Hume's contribution will be discussed in detail (Section 2.3.3),

15 Most would accept 'local determinism', the position that every 'effect' has a 'cause'. However, when extrapolated to its logical conclusion (global determinism), many would begin to feel uncomfortable. Moreover, when applied to the human mind, even local determinism is almost completely abandoned, leading to the curious anthropocentric position that local determinism applies to everything except some aspects of human behaviour.
16 As shall be discussed later (Section 3.1.3), chaos theory is primarily a discovery of mathematics, not natural science. It applies to sets of equations, not necessarily the world. Whether the patterns exhibited by these equations (such as extreme long-term sensitivity to initial conditions) also apply to the world is a separate issue (one that will be discussed later). For now, it is sufficient to suggest that even assuming that the patterns exhibited by chaos theory do 'represent' the behaviour of certain phenomena well, what this implies is that these phenomena are beyond the reach of long-term prediction. It does not suggest that there are no 'mechanisms' relating the parameters of interest, or that if, somehow, it were possible to obtain completely accurate measurements of initial conditions, then predictability would still be unachievable. Perhaps the easiest way to explain the difference between determinism and predictability is by reference to Laplace's demon, a being who is able to measure the states of all of the fundamental particles in the universe and thereby infer their states at any future point in time. Laplace's demon is able to predict future states because they are fully determined. However, for us 'non-demons', failure of prediction does not necessarily point to an indeterminate universe; only a conflation of predictability with determinism would suggest so. However, such conflations are common in the systems literature (see Checkland, 1981b or Jackson, 2000a). Even philosophers of science such as Karl Popper (1982) have been known to make this mistake and be subsequently criticised for it (see Earman, 1986).
17 Even if our representations become probabilistic, mechanism is not necessarily undermined. For example, some advocate a metaphysical determinism together with an epistemic probabilism. Thus, the causal relations remain but our representations of them are shrouded in uncertainty. Others suggest that the causal relations themselves are probabilistic (the famous God playing dice argument); hence the 'machine' becomes a kind of 'slot machine' or 'poker machine' and the governing laws specify a range of possible future events. These two positions represent opposite sides of a long-standing dispute within probabilistic philosophy: the dispute over whether chance is an epistemic issue (arising from problems associated with predictability) or a metaphysical condition (arising from the structure of the cosmos). As an example, consider a single tossed coin. As we know, the probability of it landing on 'heads' is ½. However, there is an open-ended question arising from this statement. Namely, if all of the initial conditions were known and all of the factors affecting the flight of the coin incorporated, will the outcome of the toss be completely determined (i.e. if Laplace's demon were tossing the coin, would she/he/it be able to 'guess' correctly every time)? The mechanist invariably answers 'yes' to this question. However, his/her position encounters the fundamental problem of justifying why s/he believes it. Given our non-demonness, it is impossible to justify such an assumption empirically. On the other hand, the non-mechanist, who invariably answers 'no' to this question, cannot justify their position either. Accordingly, it seems hard to avoid the conclusion that rather than being a verifiable (or falsifiable) 'discovery' of modern science, mechanism is just an extremely popular metaphysical presupposition of modern science.

however, for now, it is sufficient to remark that following Hume's critique of causality, Kant suggested that time and space (the primary qualities of the machine) must not inhere in things but in our apprehension of them. Thus, observations of "one type of thing being followed by another type of thing", which invariably lead to causal speculation, become strictly subject driven (Kant, 1781). If causality is taken 'out of the world' and placed 'into our minds', then mechanistic metaphysics is undone. The world is not a machine; rather, the way modern science has understood it is 'as a machine'!

Such a conclusion seems even more reasonable given that recent advances in science itself have been seen to undermine the 'machine' metaphor. Since the strategy of mechanism is to assume that machines are 'the metaphor' for understanding the world, the observation that certain aspects of the world seem to behave nothing like a machine is highly suggestive of its failure. One such observation arises from elementary particle theory. The current view within this discipline is that the behaviour of elementary particles is irreducibly indeterministic (Dupre, 1993)18. What this means for mechanism on the 'macro-level' is far from clear; however, it does suggest that the machine metaphor should not be taken as the self-evident starting point of modern science that it is often thought to be. Indeed, the idea that there exists a causal nexus governing such 'macro' phenomena as behavioural ecology, economics, human psychology, management practice, politics and international relations seems almost absurd. Furthermore, if it is conceded that any of these domains are indeterministic, then the determinism of every domain is threatened by interactions19.

On the face of it, and notwithstanding its success within modern science, mechanism seems too strict a metaphysical presupposition to govern inquiry. Accordingly, it is argued that rather than global determinism, or global indeterminism, we accept that we have no a priori evidence for assuming either position. Certainly, it is unreasonable to assume (as the mechanist does) that the future state of any system of interest is strictly (metaphysically) determined by the current state of the system. Furthermore, rather than being inconsequential, this conclusion is of profound importance, having the potential to seriously affect a wide range of public issues outside science and philosophy. These include analysis, planning and policy setting within such spheres as education, health, defence, immigration, the environment and the economy.

18 Dupre (1993) uses the term 'irreducible indeterminism' to suggest an indeterminism that is non-epistemic. That is, it is not a reflection of our ignorance of some 'hidden variable'.
19 Dupre (1993) uses the example of Newtonian mechanics to emphasise a similar point. One of the reasons why falling objects near the surface of the earth do not obey the simple Newtonian account of behaviour is that they are liable to be affected by winds. However, if the weather is an irreducibly indeterministic system, then the behaviour of falling objects is also irreducibly indeterministic.

1.1.3 The Metaphysics of Wholes and Holistic Science

" (from the Greek 'Holos', meaning whole) is the theory, which makes the of wholes a fundamental feature of the world. lt regards natural objects, both animate and inanimate, as wholes and not merely as assemblages of elements or pafts. fltl Looks upon nature as consisting of discrete, concrete bodies and things, and not as a diffusive homoge nous continLtLt m."

- Jan Smuts

Following his articulation of the world hypotheses 'formism' and 'mechanism', Pepper (1942) introduced two others, which he termed contextualism and organicism (a.k.a. holism)20. Whereas the first two are 'analytic' hypotheses, the latter two, according to Pepper, are 'synthetic' ones. Unfortunately, Pepper never defined what he meant by the analytic/synthetic distinction21. However, he did state that a consequence of the analytic theories is that they assume that an object (or event) can be effectively understood by decomposing it into its constituent parts. In fact, this may be all that he meant when he used the term. If so, then he would probably suggest that a synthetic world hypothesis is the negation of this assumption. Specifically, that there is more to understanding an object (or event) than simply understanding its parts; some understanding of the properties of the whole, as distinct from the parts, is required.

The synthetic world hypotheses introduce the notion of 'emergence'. An emergent is said to be a property of the whole that is qualitatively distinct from the properties of the parts (Alexander, 1920; Broad, 1925)22. Most thinkers adopting a synthetic worldview argue that emergence arises from the organisation of the parts. Such a presupposition is often referred to as a 'systems perspective'. Whilst Pepper himself does not adopt the language of the systems community (indeed, it could be argued that he pre-dates the systems revolution), his articulation of contextualism and organicism should be considered as an extremely important contribution23. Accordingly, this thesis uses typically 'systems' language to describe these two positions.

20 In his book Concept and Quality: A World Hypothesis, Pepper (1967) adds selectivism to his list of synthetic world hypotheses. However, there are few differences between selectivism and contextualism, and as such this thesis prefers to use the metaphysics of contexts to describe both positions.
21 This distinction is by no means without historical controversy. For a more detailed discussion the reader is directed to Section 1.2.4, Footnote 49.
22 See Section 3.1.1 for a more detailed discussion of emergence.
23 Indeed, it is argued here that the systems community, in their haste to reject what they termed 'reductionist' modes of thought, assumed an impoverished characterisation of the metaphysical space typified by proto-scientific categories of understanding. That is, they chose to categorise the metaphysical space by way of the binary opposition mechanism/holism and the associated epistemic space by way of the binary opposition analysis/synthesis. This led to a widespread rejection of the former in favour of the latter. However, what the early systems theorists appear to have failed to understand is that there are a multitude of possible metaphysical positions that could follow a rejection of mechanism. By assuming that the a priori metaphysical presuppositions of science could only ever be either mechanist or organicist, the early systems theorists essentially locked themselves into what they saw as the only possible alternative to mechanism - organicism. However, if Pepper's (1942) contribution had been read and understood at this early stage, then perhaps some of the excesses (and failures) of the systems community could have been avoided.

According to Pepper (1942) the root metaphor of organicism (or holism) is the organism. The organicist assumes that the world is more or less composed of intrinsic 'wholes' or 'systems'. Therefore, the aim of inquiry is to uncover the 'relationships' between the parts of the system and by so doing understand its 'structure' and, by way of implication, how the structure functions to give rise to emergent properties. The idea is that if a correct understanding of the relationships is to be had then we will be able to predict the results of manipulating various parts of the system on other parts of the system. Moreover, we will be able to predict the results of manipulating various parts of the system on the 'system-as-a-whole'. For example (and continuing the organism metaphor), if we understand how the structure of the physical components of a body produces the non-physical emergent property of life, we may be able to intervene in the system and restore or extend life when it is threatened.

Whilst Pepper (1942) does not make the distinction, it is useful to distinguish between 'closed' and 'open' organicist positions. According to the closed organicist position (or closed systems perspective), the world is composed of intrinsic wholes that are distinct from each other in the sense that they do not interact. The aim of inquiry, therefore, is simply to uncover the structure of these 'wholes' and by so doing understand how to influence them. On the other hand, the open organicist position (or open systems perspective) assumes that the world is composed of intrinsic wholes that do interact with each other. Thus, the aim of inquiry becomes twofold: first, to uncover the intrinsic (internal) structure of the system of interest; and second, to uncover the nature of its (external) interactions. Only by understanding both of these aspects can the system's behaviour be successfully understood and thence the influence of change predicted.

As expected, an organicist metaphysical position leads to a completely different understanding of the nature of scientific inquiry. Whereas the goal of formistic science is to uncover nature's intrinsic categories and the goal of mechanistic science is to uncover nature's intrinsic laws, the goal of organicist science is to uncover the structure of nature's intrinsic wholes. Accordingly, holism, or systems theory, becomes the paradigmatic example of organicist inquiry.

Several attempts at developing a 'systems perspective' or 'structural understanding' of nature's 'wholes' are presented in this thesis. These include the theory of open systems, cybernetics, complexity theory and autopoiesis. A common aspect of each of these is the belief that the structure of the 'wholes' that they study is something that is 'given' by the structure of reality. For example, according to the autopoietician, the theory of autopoiesis is not simply a useful structural understanding of life but the defining feature of all 'living wholes' and therefore capable of accounting for all forms of human behaviour. This distinction has led to perhaps the most significant problem facing organicist metaphysics (and therefore organicist forms of inquiry): its insistence on the 'intrinsic' nature of the 'wholes' that it studies. According to the organicist, each whole is distinct from its environment (whether it interacts with it or not). Therefore, both the internal structure of the whole and the nature of the external boundary of the whole are 'given' by the structure of reality. Furthermore, the boundary is often thought to be self-evident or at the very least empirically verifiable. Such a metaphysical presupposition has led to some well-documented disasters when 'systems analysts' have made inappropriate boundary judgements (see Chapter 3.2 for more detail). Indeed, once the boundaries have been 'discovered', organicist inquiry (systems theory) sees itself as little more than functional analysis, with neither the power nor the intention of transcending the context of the 'system'.

In the last quarter of a century a growing number of 'systems theorists' have begun to doubt the organicist assumptions inherent within systems theory. At the forefront of these 'doubters' have been some of the most prominent names in the systems community, including one of the founding figures of Operations Research in the US, C. West Churchman, and his students Ackoff, Mason, Mitroff, Linstone, Ulrich and others, and some of the most prolific systems theorists in the UK, including Peter Checkland and his growing band of 'soft systems' methodologists and the 'critical systems' community centred around the University of Hull. Whilst various languages have been used to articulate the change in perspective, to my knowledge none have discussed it in terms of a change in metaphysical persuasion or adopted the language of Pepper to make sense of the emerging new metaphysical backing for systems thinking24.

Notwithstanding this, much of the work of these revolutionaries can be seen as being influenced by the growing realisation that systems are not inherent in the structure of reality. Indeed, this realisation is quite explicit in the work of many. What is less explicit is the change in metaphysics that has given rise to this conclusion. That is, the old ontology of real-world objects (systems) has given way to a new metaphysics of irreducible relational complexity. Everything is connected to everything else, and therefore 'objectification' is obviously a human affair rather than a natural phenomenon. Accordingly, systems are increasingly seen to 'emerge' only in the plural. They live, breathe, and have their being under the contingent boundary judgements of systems thinker(s) and are not previously 'given' by the structure of reality. Acceptance of this leads to the replacement of the organicist view of a transcendentally grounded world of systems with a new metaphysics of inter-connectedness. Within this new metaphysics, there are an infinite number of possible system-environment characterisations (contexts), all of which reveal something different about the world and none of which has a privileged relationship to it.

24 Indeed, most commentaries choose to discuss recent trends in the systems community in terms of the adoption of new metaphors of organisation, new foundational social theory or new methodologies for guiding practice.

1.1.4 The Metaphysics of Contexts and Contextual Science

"Out of what is in itself an indistinguishable, swarming continuum, devoid of distinction (sunyata), our emphasis, our senses make for us, by attending to this motion and ignoring that, a world full of contrasts, of sharp accents, of abrupt changes, of picturesque light and shade. Helmholtz says that we notice only those sensations which are to us of things. But what are things? Nothing, as we shall abundantly see, but special groups of sensible qualities, which happen practically or aesthet¡cally to interest us, to which we therefore give substantive names, and which we exalt to this exclusive stafus of independence and dignity."

- William James

According to Pepper (1942), the root metaphor of contextualism is an act in its context. These 'acts' are like 'incidents' in the plot of a complex novel or drama. For the contextualist, everything in the world consists of such context-dependent 'incidents'. Whereas other world hypotheses preconceive the world as having an intrinsic orderliness (such as forms, laws or systems), the contextualist argues that interconnectedness is an intrinsic feature of the world. Out of this interconnected whole we select certain contexts. These contexts act as organising frameworks, or patterns, and give meaning and scope to a vast array of detail that, without the organising framework, would be meaningless or invisible (Lilienfield, 1978). Thus, according to the contextualist, the 'context' creates the 'structure' by fusing into unity items that, in other contexts, may appear as discrete entities. Furthermore, within these contexts, meanings emerge in complex strands, or levels, that would disappear without the organising framework. Thus, for the contextualist, Newton's 'mechanistic universe' is only one integrating structure for the study of physical phenomena (neither an absolute nor an objective description of reality). Similarly, the organicist's 'systems' are not 'given' by the structure of reality as, at first, they were imagined to be, but are the products of human thought. In neither case does the context provide us with a description of the fundamental structure of the world. Indeed, most contextualists would deny that the world has an intrinsic structure that can be grasped - preferring to assume that it is irreducibly interconnected.

Pepper (1942) defines two categories of understanding within contextualism. These are quality and texture. An example of the difference between quality and texture is 'a sentence'. The quality of a sentence is the meaning of the sentence as a whole, whereas the texture is the words and grammatical relations that make it up. The two are inseparable as there is no such thing as texture-less quality, or quality-less texture. Although the quality (whole) always exhibits some degree of fusion of the texture (parts), the whole is not simply the sum of its parts. Indeed, a whole is something altogether immanent (Pepper, 1942)25. In this sense the

25 The ideas embedded within Pepper's categories of quality and texture are discussed under the concept of 'emergence' in Section 3.1.1.

properties of the whole are qualitatively distinct from the properties of the parts and, therefore, analysis of 'texture' can never fully account for 'quality'.

Like organicism, contextualism implies that to understand an object (or event) some understanding of the whole, as distinct from the parts, is required. As such, Pepper (1942) calls the contextualist position a synthetic world hypothesis (metaphysical presupposition), juxtaposing it to the analytic world hypotheses of formism and mechanism. However, unlike organicism, contextualism denies that parts and wholes are absolute categories and therefore claims a kind of ontological relativity for the structure of the world (Quine, 1969). More precisely, the structure is always relative to the context.

The implications of contextualism for systemic inquiry are revolutionary and cannot be adequately covered here. Indeed, many of the ideas that this thesis unfolds throughout later chapters can be said to be exploring such implications. According to the contextualist, there are many equally revealing ways of understanding the world. Thus, analysis for analysis' sake is unhelpful because the understanding gained from it is entirely dependent on what strand the analysis follows. As Pepper (1942) states: "what is the good of it [analysis], except as the mere fun of paddling about in the ocean of things?" Serious understanding is never gained by a simple analytical study, but by a recursive process of contextualisation and re-contextualisation. Similarly, serious 'systems understandings' are never gained by a 'functional analysis' of a given system but by recursively playing back and forth between different characterisations of wholes and parts.

According to the contextualist, understanding is always relative to a 'working' context and therefore all knowledge is fragmentary, partial and contingent on such contexts. As such, context creation (system definition, boundary judgement, etc.), whether external (system/environment) or internal (part/whole), becomes central to knowledge formation and, by way of implication, has a significant bearing on action. Contextualistically minded systems thinkers, such as Churchman (1968a, 1970b, 1979), Ulrich (1983; 1987a) and Midgley (2000), understand this and have had the insight to associate boundary judgements with power and argue for an inclusive approach to boundary definition. Similarly, most contextualistically minded systems thinkers demand the use of multiple contexts simultaneously, thereby helping to dissolve the power that a dominant context has over others and, by so doing, broaden our understanding of the problem under study.

Obviously, contextualism blurs traditional distinctions between subject and object. According to the contextualist, the subject is always implicated in the construction of objects because what we know as an object (system) arises out of the contexts that we create. As such, the contextualist claims no transcendental legitimation for the categories of his/her understanding. At first thought this may seem a weakness of the contextualist position.

However, it is argued here that rather than being a weakness, it is actually one of the position's major strengths. The absence of any grand claims means that contextualism can never fall foul of sceptical attack (as formism, mechanism and organicism have often done). Furthermore, it is argued (next) that rather than being a culturally infeasible metaphysical position, contextualism is consistent with the intellectual spirit of the age and perfectly positioned to act as the metaphysical backing for a contemporary rethink of systems thinking.

1.1.5 Systems Theory and the Metaphysical Presuppositions of the Age

"Life is not comfo¡table, settling down into pre-ordained grooves of being; at its best, it is'élan vital', inexorably drawn towards higher forms of existence."

- Robert Lilienfield

It has been argued by many that we are currently living in a post-metaphysical age and that modern science has rid itself of the metaphysical influences that characterised pre-modern forms of thought (Carnap, 1928; Reichenbach, 1938; Popper, 1979; Habermas, 1988). Furthermore, some have claimed that, regardless of the intellectual pursuit, if we follow the methods of modern science we will end up with knowledge that is metaphysically neutral and therefore free from the values, ideals and judgements of the scientist. This thesis seeks to repudiate both of these assertions. To this end, four distinct metaphysical foundations for the pursuit of (scientific) knowledge are presented. These are by no means the only four metaphysical positions 'on the market', nor should they be thought of as the final word on the matter. However, it is argued that they conclusively demonstrate how modern science has always and will always coexist with metaphysics. Because of this, it is naïve to assume, as the logical positivists did (see Chapter 2.2), that we could excise metaphysical ideas from our knowledge base. Indeed, even Popper (1969; 1972) accepts this, claiming that "theory always precedes observation". Where this thesis deviates from the received Popperian tradition is in its claim that it is just as naïve to assume that any one metaphysical presupposition is true whilst its competitors are false (as mechanists are prone to do). Perhaps a better way of thinking about these issues is by making use of the notion of 'helpfulness'.

Around the middle of the last century a small intellectual community emerged that, whilst hailing from a variety of disciplinary backgrounds, were united in the broad agreement that the mechanistic metaphysics that had dominated modern science was proving to be increasingly unhelpful. Mechanistic assumptions had led to 'reductionist' attitudes towards such things as ontology, theory development and methodology (see Chapter 2.1). Constrained by this ideology, scientists found themselves having to increasingly simplify the phenomena they were studying so that it 'fitted' within the ideals of mechanism. In opposition to mechanism, this new community sought to dismantle much of its mantra and attempt to

replace it with a more appropriate attitude towards inquiry. What they came up with was a largely holistic/organicist form of inquiry which later came to bear the appellation 'the systems approach'.

Some commentators have argued that the biological background of many of the 'pioneers' of systems thinking was a major influencing factor in the highly organicist path taken by the systems community (Lilienfield, 1978). However, another explanation is that organicism (holism, systems theory, structuralism, etc.) was an idea whose 'time had come' and, as such, was in line with the intellectual spirit of the age. Certainly, around the same time many individuals (as well as whole research groups) that were disconnected from the systems community were coming up with similar understandings of the objects of their disciplines. Thus, we have Bogdanov's (1910) Tektology and De Saussure's (1916) Course in General Linguistics both developing typically 'systems' understandings that predated the systems revolution. Also, Levi-Strauss's (1963; 1966) Structuralism developed a largely compatible system of thought within the human sciences to that of Von Bertalanffy's (1950; 1956; 1971) Systems Theory, which arose from within the natural sciences. Thus, by the mid twentieth century, twin revolutions describing themselves in terms of a mistrust of 'reductionism' and 'analysis' and a call to 'holism' and 'synthesis' were underway.

Whilst the early systems thinkers understood the limitations of the machine metaphor (i.e. mechanistic metaphysics), it is argued that they failed to comprehend many of the other difficulties facing modern 'scientific' forms of inquiry. These additional difficulties centred around the way science legitimated itself as a superior form of knowing. For example, modern science understood itself as superior in the sense that it possessed something called 'method' which was a guarantor of something called 'objectivity'. Furthermore, it was thought that if scientific method were followed, then human knowledge would converge toward something called 'truth' and by so doing provide us with the kind of certainty that we craved. At the time of the systems revolution these epistemic and methodological precepts had coalesced into a philosophy known as positivism and had been co-existing with the metaphysical precepts of mechanism for some time. Unfortunately, the early systems theorists, whilst recognising the decreasing helpfulness of mechanism, did not recognise the problems associated with positivism. As such, the new systems approaches developed for several decades under the 'old' positivist epistemic and methodological assumptions.

However, by the mid 1970s critiques of the positivist (and by this stage post-positivist) epistemic positions began to appear within the systems, and structuralist, communities. Increasingly, the idea that a certain 'method' could guarantee 'objectivity' and thence 'truth' succumbed to sceptical attack. Within structuralism, a series of post-structuralist critiques were carried out by Foucault, Derrida, Kristeva, Lyotard, Baudrillard, Barthes and others. Around the same time a number of critiques began to appear in the 'systems' and 'operations

research' communities. These were advanced by such dignitaries as Ackoff, Churchman, Checkland, Ulrich, Rosenhead, Flood, Jackson, Midgley and others. It seems that two things were occurring. First, the systems and structuralist communities were progressively moving from an holistic to a contextualist a priori metaphysical position. Second, this shift was having a carry-over effect on the epistemic positions (and associated narratives of legitimation) long held sacred by modern science. Whilst the organicist assumption of the intrinsic nature of the 'wholes' that it studied led it to claim a kind of transcendental legitimation for the knowledge it generated, the contextualist position made no such claims. Thus, embedded within the metaphysics of interconnectedness lie the seeds for a rejection of the legitimation narratives of modern science as well.

Just as organicism was an idea whose time had come in the mid twentieth century, it is argued that contextualism is an idea whose time has arrived in the late twentieth and early twenty-first century. Whilst few are using the term, new contextualist styles of understanding are emerging across a broad range of disciplines. This is changing the way we view such fundamental things as 'representation' and 'knowledge'. Indeed, many have argued that a societal-wide change is occurring. This is where the term 'postmodern' comes into the framework of this thesis. The term is undoubtedly problematic, as it has been used to describe a variety of things within many different disciplines. Even worse, it has been kidnapped by popular culture and incorporated into the jargon of the age. As such, it has been vulgarised and over-used and thereby stripped of nearly all meaning: thus making it a perfect bedfellow for the other topic of this thesis - systems26.

Notwithstanding the difficulties associated with the term 'postmodern', there are a number of ideas within what is generally described as postmodern thought that are important for the purpose of re-thinking systems thinking in a contextualist manner. At this point two paths effectively lie open:

1. To describe these ideas independently of the postmodern context in which they are often couched.

2. To describe these ideas together with the 'postmodern story' which tells of a broad societal shift unfolding.

Some of the thinkers this thesis reviews have taken the first path, whilst others have taken the latter. One, Richard Rorty, has, at times, done both, coming to the conclusion that:

"l have sometimes used 'postmodern' myself, in the rather narrow sense claimed by Lyotard as 'distrust of metanarratives'. But I now wish that I had not. The term has been so over-used

26 It seems that misery acquaints both men and ideas with strange bedfellows.

that it is causing more trouble than it is worth. I have given up on the attempt to find something common to Michael Graves's buildings, Pynchon's and Rushdie's novels, Ashbery's poems, various sorts of popular music, and the writings of Heidegger and Derrida and have become more hesitant about attempts to periodise culture" (Rorty, 1991).

Whilst I sympathise with Rorty, two things have made it difficult to ignore the postmodern narrative. First, it cannot be denied that significant (some claim revolutionary) epistemic changes have been occurring within all of these fields and many more besides27. Second, many of the principal proponents of these changes are self-consciously attempting to characterise their own contributions within the modern/postmodern narrative.

What is not immediately obvious is whether these changes are just a passing fad, or whether they will come, in time, to rival the enormous period of change that characterised the rise of the modern episteme during the Enlightenment. In the words of Lyotard (1979): "the general situation is one of temporal disjunction, which makes sketching an overview difficult". Indeed, the difficulty facing serious scholarship of contemporary cultural change can perhaps best be seen in the enormous number of crude caricatures of both modernity and postmodernity in the literature. Certainly, the 300 years from 1650 that are usually afforded modernity do not show one homogenous overarching societal worldview (as much of what is to follow in the next Chapter will show). Indeed, this period saw a continual dialogue between major conflicting opinions over such fundamentals as truth, knowledge, legitimacy and progress28.

Rather than get involved in a debate that can only have meaning with the benefit of perhaps centuries of hindsight, this thesis takes the view that it is possible that we may be at a defining moment not too dissimilar to the start of the Enlightenment. Certainly, many of the presuppositions of the modern worldview have come under increasing attack. Not surprisingly, the vacuum that these presuppositions have left is currently being crammed with a plethora of competing ideas, none of which has achieved any semblance of dominance. As such, it is possible that the central presuppositions of modernity will once again rise to take their place as the privileged a priori metaphysical, epistemic and methodological assumptions of intellectual inquiry. Epoch-defining change is certainly not a foregone conclusion. However, if the reader will permit me to be personal for a moment, I would have to say that I would welcome such a change - finding much of the modern scientific worldview flawed, austere and impoverished. Accordingly, I have thrown my voice into the fray in an

27 Interestingly, Rorty (1991) concludes that 'pragmatism' is a better description of his position. However, as the following passage from Charles Peirce (the father of American pragmatism) testifies, even that term was muddled in his day: "it has probably never happened that a philosopher has given a general name to his own doctrine without that name soon acquiring in common philosophical usage, a signification much broader than was originally intended".
28 According to this line of argument, if there is an emerging 'postmodernist' break with the past, it may best be characterised not by a sudden and broad change of societal opinion on the answers to certain fundamental questions of the modern period, but by claiming that the modern discourse has 'run its full course'. Thus, the postmodernist sees the modern disputes over fundamentals such as knowledge, truth and legitimacy as fundamentally unresolvable and therefore bows out of the discourse altogether. That is, postmodernism may be characterised not by a new opinion to an old question, but by questioning the usefulness or legitimacy of the old question itself.

attempt to help influence the outcome. It is for this reason, and this reason only, that this thesis has chosen to couch its contribution within the context of the more general 'postmodern story'29.

Notwithstanding the above, it is important to point out that this is not a treatise about intellectual or cultural epochs, but about ideas. Specifically, this thesis is about ideas important for the systems movement to come to terms with30. Whereas the schoolmen of the middle ages introduced Aristotle to the scriptures, Hegel introduced Kant to the Romantics and Derrida introduced Hegel himself to Genet, this thesis attempts to introduce postmodernism to the systems approach. To do so, it will need to consciously un-weave one discourse (the epistemic story of postmodernism) and re-weave it within the context of the other (the holistic story of the systems approach). Accordingly, much of the focus of the rest of Part One is on articulating and extending the principal ideas associated with the postmodern story. However, rather than presenting these ideas as a systematic whole, it chooses to present them in the form of a historical conversation sequence running from the Greek thinkers of antiquity, through the Renaissance to Descartes, Newton, Kant, Nietzsche, Wittgenstein, Foucault, Derrida, Rorty and beyond. At various points within this conversational sequence my own critiques will be mixed with those of others in order to steer the conversation in the direction that this work seeks to pursue. At other points in this conversational sequence certain questions of fundamental importance will cease to be so, having 'run their full course', and been replaced by different questions that claim to 'dissolve' the former. It is at these points that words like pre-modern, modern and postmodern are used.

The conversational sequence presented is not intended to be comprehensive and deliberately glosses over contributions from key conversers such as Hume, Hegel, Heidegger and Habermas, whilst ignoring entirely others such as Berkeley, Bataille, Baudrillard and Barthes. Rather than attempt a comprehensiveness that is doomed to failure at the outset, what follows is an attempt to un-weave a key strand of thought that may, in turn, be re-woven into the context of the systems movement. Undoubtedly, this strand could have been unwoven and rewoven in multiple ways. Thus, the absence of the debate between Kierkegaard and Hegel, for instance, is not intended to imply irrelevance. It is therefore claimed that this thesis 'practices what it preaches', deliberately scorning the idea of a comprehensive (or holistic) study in favour of the concept of contextualisation and re-contextualisation ... but more on that later.

For now, it is time to consider what is meant by the term 'modernity'.

29 Accordingly, this thesis adopts both polemical and expository styles at various stages.
30 Indeed, if this work is to be remembered for nothing else than introducing a new discourse to the systems movement then it is argued that it has made a valuable contribution.

1.2 The Rise and Fall of the Modern Episteme

" is that moment when man invented himself;when he no longer saw himself as a reflection of God or natLtre."

- Robert Cooper and Gibson Burrell

Modernity, or the 'modern world view', is a term that has been very much in vogue with historians, sociologists, artists, the literati and various cultural observers over the past quarter of a century. Ironically, and despite the fashionable nuances invoked by the term, the word has been widely used to describe a declining weltanschauung. That is, it has been used as a label for the worldview that many argue is currently in the midst of being replaced.

Ever since the Enlightenment, the idea of 'modernity' has been a central theme of almost all western thinkers. Scientists, engineers, architects, novelists and historians all saw themselves as taking part in the grand Enlightenment project of 'modernisation', and constantly defined themselves in opposition to the continuity of traditional ways of life (the so-called pre-modern lifestyles). However, despite such self-awareness, self-characterisation was rarely attempted (thus explaining the recent 'free for all' of postmodern characterisations of modernity). Indeed, characterisation is not without considerable dangers. Whilst modernity (and perhaps modernism) may not be as unstable and misunderstood a term as postmodernity (and postmodernism), it would be optimistic to say that any one characterisation has been generally accepted. Disagreement is to be found on such fundamentals as the duration, the major projects, the major problems and the major contributors of the modern age. For example, does modernity encompass both the Renaissance (Section 1.2.2) and the Enlightenment (Section 1.2.3)?31 If so, does it proceed all the way through the twentieth century? Indeed, are we still in modernity (postmodernity being just a periodic self-reflection within the great modern juggernaut - a sort of reflexive modernity)? Are Hume, Kant, Kierkegaard or Nietzsche modern or postmodern? What about the French structuralists and post-structuralists, or the German Romantics, or the Avant-Garde movement? Is American pragmatism an early form of postmodernism or some sort of sophisticated defence of modernity? These are questions beyond the scope of this study. Not surprisingly, it is suggested that the answers are highly context specific.
Accordingly, the story of modernity and postmodernity that follows is self-consciously selective and contextual, being concerned mostly with the impact of certain epistemic ideologies on science, and therefore, the likely implication of adopting these ideologies within systems theory.

31 Foucault (1966) argues that the Modern Age begins at the end of the Enlightenment.

1.2.1 The Origins of the Modern Episteme: The Greek Period

"One thing only I know, and that is that I know nothing."

- Socrates

Whilst alternative classifications abound, most historians define the modern era as beginning at Westphalia in 1648 with the signing of the peace accord that effectively ended Europe's Thirty Years' War. According to traditional historical accounts, the political stability that followed this act ushered in what is commonly referred to as the 'Enlightenment', or the 'Age of Reason'. However, in order to best understand the changes that occurred during this tumultuous age it is necessary to begin our discussion long before the accord of Westphalia. Indeed, many of the principles that were being articulated during the Enlightenment can be traced to the philosophers of southern Europe during the period 600 B.C. - 200 A.D.

There are countless studies tracing the influence of Greek philosophy on the development of the western scientific tradition; see, for example, Singer (1941) and Farrington (1949). A detailed analysis of the Greek legacy, however, is beyond the scope of this work. For the purpose of this story it is sufficient to note that a synoptic view of the central issues defining the Greek period yields an adherence to two principal faculties for generating scientific knowledge: sense (observations of the world) and reason (deductive rational discussion about the world). In light of this, it can be argued that the central debate defining the Greek period was that of the primacy of either sense or reason as a mechanism for generating knowledge of the world.

Among the early Greek philosophers, Heraclitus, Democritus, Empedocles and Hippocrates can be seen as championing the primacy of observation, whilst others such as Thales, Parmenides and Pythagoras argued for the primacy of reason (mathematically expressible if possible). Following these early Greek philosophers, the Athenians - Socrates, Plato and Aristotle - developed quite sophisticated systems of thought, making contributions to the formation of the natural sciences, philosophy and mathematics. However, even these three were divergent on the issues of sense and reason. Socrates and Plato, in the tradition of the rationalists, focussed much of their energy on deriving logical and coherent systems of ideas with little interest in observational validation. Of particular note were Socrates' method (discovery by question and answer) and Plato's belief in the ultimate reality of ideas. Aristotle, on the other hand, argued that ideas must never be separate from their embodiment in objects. In this sense, he championed the observational school. Indeed, it is perhaps Aristotle's legacy, of all the classical philosophers, which has influenced the nature of the western scientific enterprise the most.

The Aristotelian world was seen as a teleological striving of all things to achieve their 'true nature'. The aim of observation was to gain insight into the 'true nature' of the observed phenomena. Much of this worldview remained intact within Western Europe for some 2000 years. It provided the motivation for the observations of the world by the schoolmen of the middle ages, which in turn eventually led to the scientific revolution. Furthermore, the ideas of Aristotle continued to influence the aims of the scientific enterprise well after the overthrow of the so-called Aristotelian worldview and its replacement with the Newtonian one. This can be seen most strikingly in the scientific aim of generalising observational data into 'laws' that describe the supposed 'true nature' of the phenomena under study. Thus, 'mechanism' itself can be seen to have an Aristotelian formative influence. In contrast (and often in antagonism) to the observational work of Aristotle and others, the Greek rationalists developed the language of mathematics and deductive argument. What was missing in the Greek period, and was to be supplied by the scientists of the late Renaissance and early Enlightenment, was the synthesis of 'sense' and 'reason' by using mathematics to represent observed phenomena (Checkland, 1999). Thus, one of the key differences between pre-modern and modern thought was the concept of representation. Accordingly, the modern intellectual was troubled with ensuring the accuracy of representation. Initial attempts at ensuring accuracy focussed on 'method': the process by which we generate representations.

1.2.2 The Rise of the Modern Episteme: The Renaissance

"What is not measurable, make measurable."

- Galileo Galilei

'Renaissance' is a French term meaning 're-birth' or 'revival' and the historical period usually afforded it was, in a sense, both (Schmitt, 1981). As with almost every periodic characterisation, historians disagree on the precise duration of the Renaissance. However, the broadest characterisations tend to begin with the rediscovery of classical Greek thought in the mid thirteenth century and end with the scientific revolution and the onset of the Enlightenment in the mid seventeenth century. Whilst the period is popularly associated with art and literature, important themes associated with science, mathematics, philosophy and engineering also emerged during the time. This is perhaps best demonstrated in the life and work of the most famous Renaissance thinker: Leonardo da Vinci. At once a gifted artist, architect, scientist, medic, engineer, mathematician and philosopher, da Vinci typified the Renaissance ideal of l'uomo universale - the universal, or all-round, man. The skills that da Vinci and his contemporaries acquired in these diverse intellectual pursuits at the height of the Renaissance would have remained unattainable, however, if it were not for the rediscovery of Greek learning that began to occur from the mid thirteenth century onwards.

The division of the Roman Empire into an essentially Greek-speaking East and Latin-speaking West by Theodosius in 395 A.D. had a momentous impact on the history of European intellectual life (Shelley, 1982). With the passage of time, knowledge of the Greek language in the West became rare and consequently the writings of the great Greek thinkers became unavailable to most of Western Europe (Grant, 1996). On the other hand, Greek science and philosophy continued to be studied in the eastern (Byzantine) empire, such that by the ninth and tenth centuries when the Arabic-speaking Moslems conquered Constantinople, they found libraries full of Greek manuscripts awaiting them. Most of the texts to survive the siege of Constantinople were dutifully translated into Arabic by Moslem scholars and subsequently extended by successive generations of Arabic-speaking intellectuals.

For the next 400 years the Islamic empire, stretching from North India through the Middle East and North Africa to Spain, was the centre of learning and development. The Islamic world produced most of the great thinkers of the time, including al Kindi (801-866), al Farabi (870-950), Avicenna (980-1037), al Ghazali (1058-1111), Averroes (1126-1198) and the Arabic-speaking Jew, Moses Maimonides (1135-1204). Philosophy, mathematics, science, geography, art and poetry all flourished within the most cosmopolitan empire of the time. Of particular importance to the intellectual life of the Islamic world were the works of Aristotle. Al Kindi translated and extended most of the Aristotelian manuscripts to survive the siege of Constantinople, following which Arabic scholars debated the merits of Aristotelian science and philosophy for generations. For an overview of this debate, the reader is directed to al Ghazali's (2000) critique The Incoherence of the Philosophers followed by Averroes' (2002) defence, The Incoherence of Incoherence.

Coinciding with the rise of intellectual life in the Islamic world, Europe descended into a period of relative stagnation. Various reasons have been suggested for these 'dark ages', most of them focussing on the massive invasions of 'uncivilised' Celtic and Germanic peoples into areas previously under Roman protection. The transformation of Western Europe from a largely urbane Roman population to a largely rural Germanic population, together with the lack of Greek scholarship and/or Latin translations, had the effect of stifling the growth of intellectual inquiry throughout the continent. In contrast, the Islamic world was sophisticated, urbane, commercial and cosmopolitan. Furthermore, it was held together by a common language, thus enabling the free exchange of ideas from India to Iberia. Post-Roman Europe had almost none of these advantages, being largely rural and feudal, and perhaps more tellingly, increasingly fractured by a multitude of regional tongues. A consequence of this is that education and study became a local affair and retreated into the monasteries that began to dot the European landscape. However, it was in these monasteries that 'schoolmen' such as Gerbert of Aurillac (later to become Pope Sylvester II), Adalberon of Laon, John of Auxerry, Thierry of Chartres, Peter Abelard and John of Salisbury began to renew an interest in science and philosophy32.

The intellectual pursuits of the schoolmen of the late middle ages led to a renewed interest in Greek thought, most of which was only available in Arabic and only to be found in lands under Islamic control. At this point, the schoolmen found themselves the beneficiaries of a remarkable historical quirk. At the very moment that they were ready to develop more detailed understandings of the world, the Spanish reconquista won back most of the Iberian peninsula from the Moors, revealing a virtually complete body of Arabic learning at the sumptuous libraries of Toledo. Thus began the great age of translations. Intellectually starved scholars from all over Europe came to study the Arabic texts left in the wake of the fleeing Moors. Amongst these manuscripts were to be found Arabic translations of Ptolemy's (100 A.D. - 170 A.D.) Almagest, Aristotle's (350 B.C.) Physics and Metaphysics and Euclid's (330 B.C. - 220 B.C.) Elements and Optics. On top of these, there was also an impressive array of original works from Arabic scholars themselves, perhaps none more important than al Khwarizmi's (1831) Algebra, which simultaneously established the use of zero as a calculating device and the use of pro-numerals in equations to represent unknown quantities33.

Just as they had some four centuries earlier in the hands of the Moslems, Aristotle's works proved an immediate success in the hands of their new European guardians. In all, over 2000 Latin translations from the Arabic versions of Aristotle's works have been identified (Grant, 1996). The totality of this body of Aristotelian literature, together with the subsequent extensions and commentaries made by the schoolmen, transformed the intellectual contours of Western Europe. According to the Aristotelian cosmology, the moon, sun, planets and stars were thought to revolve around a central, stationary earth. In order to accommodate Aristotle's conception of a central, stationary earth with observations of the motions of the heavenly bodies, Ptolemy had proposed that the latter were held in orbit by a structure of spheres that enveloped the earth like the layers of an onion. Whilst this conception of the universe was eventually to be replaced by the work of Copernicus, de Brahe and Kepler, the types of questions Aristotle and others were asking in these texts were a radical departure from the theologically-centred inquiry of the middle ages. As such, Aristotle's legacy far outlasted his cosmology.

One scholar heavily influenced by Aristotle was Thomas Aquinas (1225-1274). Among other things, Aquinas attempted to 'prove' the existence of God from Aristotle's theory of causes and reconciled much of the Aristotelian picture of the universe with the theology of the Catholic Church at the time. The result was his Summa Theologica, following which Catholic theology and philosophy shared a common Aristotelian framework. However, this was always going to be an unhappy marriage and from the beginning tensions began to emerge. In 1277 the Bishop of Paris condemned the Aristotelian notion that the universe was eternal. A century later, in an attempt to show the inconclusiveness of Aristotelian cosmology (and by way of implication the inferiority of philosophy to theology) Nicole Oresme argued that whereas Aristotle suggested an immobile earth and the daily rotation of the heavens, observations could be interpreted just as well by supposing the contrary (Harman, 1983)34. The effect of these critiques was that almost immediately after the death of Aquinas, the Aristotelian cosmology began to be undermined.

32 The foundations of institutional forms of learning in the West can be traced to the rise of these monasteries. Together with the craftsmen's guilds, they formed the basis of the modern university. Thus, by the mid thirteenth century universities flourished in Paris, Orleans and Montpellier in France, Oxford and Cambridge in England and Bologna and Padua in present-day Italy.
33 Goldstein (1980) argues that al Khwarizmi's Algebra was actually an interpretation of the Hindu text Siddhanta.

Adding to the weakening hold that Aristotelian thought was having over Europe was the discovery of a body of writing attributed to Hermes Trismegistus35. In the hands of Renaissance thinkers, these writings were developed into a worldview that became known as the Hermetic tradition. Stressing 'natural magic' and an 'enchanted' view of nature, the Hermetics of the Renaissance held that matter was impregnated with an active spirit through which celestial forces acted. The aim of natural magic was to grasp nature's hidden powers and by so doing control the natural world. It was the Hermetics of the Renaissance who first equated the control of nature with an understanding of its secrets. Following the Hermetics, Renaissance thinkers armed with what they considered to be the secret 'laws' of nature attempted to use them in all manner of pursuits for human gain, ranging from alchemy to navigation.

If not the first, then perhaps the most influential Renaissance thinker to promote the practical benefits of unlocking the secrets of nature was the Lord Chancellor of England under James I - Sir Francis Bacon (1561-1626). In many ways Bacon was a Renaissance man in the mould of da Vinci. Often described as an essayist, Bacon was a thinker with a broad understanding of science, philosophy, engineering, politics and theology. Whilst he castigated the magic of the Hermetics, Bacon adopted their insistence that the discovery of nature's truths would be accompanied by utility. According to Bacon, armed with the truths of nature, humanity would be empowered with the means of ruling over it. In this respect his vision laid the foundations for the modern industrial and technological society that developed over the 400 years following his death.

Apart from his essays on the potential human benefits of science, Bacon's other contributions lay in the development of method. Criticising the learning of both the Aristotelian philosophers (whom he labelled 'spiders' spinning philosophical cobwebs) and the craftsmen (whom he labelled 'ants' gathering information), Bacon urged the student of nature to be as a 'bee': both gathering information and systematising it into systems of thought. Thus he urged the synthesis of empirical and rational studies of nature. In the context of the Greek legacy, Bacon's method may be defined as the use of repeated observation (sense) to generate mathematically expressible laws (the true nature of the object/phenomena) and the use of symbolic manipulation (reason) to yield predictions and explanations.

34 Oresme argued that it was possible that observations of planetary motions (previously thought to prove that the earth was stationary) could be consistent with a rotating earth. From this, Oresme concluded that generalisations from observation were necessarily plural and the choice between competing theories (all of which were consistent with observations) was a matter of faith.
35 Despite Renaissance claims that the Hermetic writings represented a lost Mediterranean tradition of 'secret knowledge' that predated the Egyptian civilisation, they are now widely thought to be the product of Hellenistic Alexandria during the period 100-300 A.D. (Harman, 1983).

Against the observational school of Aristotle, which saw mathematics as inapplicable to the natural world, Bacon argued that repeated observations should always be systematised within a mathematical framework. Similarly, against the idealism of Plato, which saw the ideals of truth and certainty as attainable only within an ideological framework, Bacon argued that inference from real world observations to the best explanation of causes (through the logic of abduction) could yield truth about the natural world when verified repeatedly (i.e. the logic of induction)36. Moreover, Bacon was convinced that this method would not only lead to individual discoveries, but show the interrelations of the sciences themselves, thereby bringing them together into a unified whole (Grenz, 1996).

1.2.3 The Pinnacle of the Modern Episteme: The Age of Reason

"Cogito Ergo Sum."

- Rene Descartes

Whilst the rediscovery of classical science by Renaissance thinkers provided much of the foundation for the modern era, it was during the period immediately following the Renaissance, known as the Enlightenment, that the modern worldview arrived in its full glory. As has been suggested, the Enlightenment is often described as beginning with the signing of the peace accord at Westphalia in 1648 and ending with the publication of Kant's Critique of Pure Reason in 1781. The former event provided a social and political context that fostered the pursuit of knowledge, whilst the latter provided an effective challenge to many of the epistemic preconceptions of the age. Although lasting less than 150 years, this period of intellectual revolution managed to usher in a number of completely new metaphysical and epistemic ideologies, displacing those that had reigned in Western civilisation for over a millennium. Indeed, in many ways, the Enlightenment has lived on far beyond the 150 years commonly afforded it, providing us with most of the metaphysical and epistemic ideologies associated with modernity.

36 Difficulties associated with the logic of induction will be explored in Section 2.2.1.

Grenz (1996) argued that, above all, the Enlightenment was the product of a revolution in philosophy, inaugurated by the 'father of modern philosophy', Rene Descartes (1596-1650). Descartes was motivated by the idea that, in order for science to distinguish itself from mere superstition, it was necessary to show that scientific styles of inquiry alone could provide certain knowledge37. Descartes began by systematically sifting through his own preconceived opinions to establish which of them he could designate as certain, stating:

"Suppose we had a basket full of apples and were worried that some of them were rotten. How would we proceed? Would we not begin by tipping the whole lot out and then pick up and put back only those we saw to be sound?'

However, when Descartes came to 'replace his apples', he found that only one thing was beyond doubt, namely his own existence38. This conclusion he neatly expressed by borrowing from Saint Augustine the appellation Cogito ergo sum (I think, therefore I am). In reaching this 'bedrock of certainty' Descartes claimed that there is an inescapable logical limit to scepticism and once this has been reached an entire body of knowledge can be reconstructed. Indeed, this is exactly what Descartes (1637) claims to have achieved in his Discourse on Method, where, having already doubted sensory information, Descartes reconstructs knowledge of the external world from the inside out. That is, he relies on radical subjectivity to achieve his end point of radical certainty.

A thorough examination of Descartes' method is beyond the scope of this study (the interested reader is directed to Descartes' (1641) Meditations on First Philosophy, in particular, Meditations 3 - 6)39. However, two important points need to be made:

1. In his second Discourse, Descartes spells out four rules for properly conducting one's reasoning. The first rule covers the avoidance of prejudice, the second describes the process of analytic reduction40, the third requires an orderly progression from the simple to the complex, and the fourth calls for complete analysis, with nothing omitted. In many respects, the arguments developed in this thesis are a direct challenge to these four rules of Cartesian reasoning. It is argued that prejudice is unavoidable (i.e. objectivity is a myth), that analysis is sometimes counter-productive, that there is rarely an orderly progression from simple to complex and that the idea of 'complete analysis' is subjective and relative to various boundary judgements made by the analyst.

37 The Cartesian focus on knowledge inaugurated a shift in philosophy from the study of being (metaphysics) to the study of knowledge (epistemology).
38 Descartes' (1641) Meditations on First Philosophy describes what he calls the 'illusion' and 'dreaming' arguments against the certainty of sensory perception, followed by the 'deceiving God' argument against the certainty of the rules of logic. It is not until the Second Meditation that he reaches what he calls the 'bedrock of certainty', the thinking self.
39 A crucial component of Cartesian reasoning is his claim that the idea of a 'perfect being' could only be placed inside us by such a being. Thus, Descartes claims the existence of God as the next most fundamental certainty derivable from the 'thinking self'. Once the existence of God is established Descartes claims that the existence of the external world follows. Indeed, God acts as a kind of epistemic guarantor (the idea of the 'God's eye view' will be discussed when dealing with the correspondence theories of truth later). The upshot of this is that without certain knowledge of God, certain knowledge of the external world is impossible. Critics of Descartes have rightly argued that this line of reasoning, far from being foundational (as Descartes claims), is actually circular. It is difficult to see how Descartes could claim that the existence of an external God is certain when he also claims that certain knowledge of anything external to the thinking self is unattainable without certain knowledge of God's existence. This problem has been labelled the 'Cartesian Circle'.
40 "The second was to divide each of the difficulties that I was examining into as many parts as might be possible and necessary in order best to solve it" (Descartes, 1637).

2. Whilst Descartes allows for the certainty of the existence of physical objects, he claims that the senses are far too unreliable to provide a true account of them, arguing that: "In many cases the grasp of the senses is obscure and confused". In order to provide a true account of the external world, Descartes' 'subjective' strategy finds that the beauty of rational laws "which God has implanted in our souls" can provide the type of certain characterisation of the natural world that his project demanded. Thus the messiness of the sensory world is but 'apparent' and when seen through the precision of mathematics, the apparent messiness dissolves, yielding the underlying logical structure of the world41. Once again the arguments developed in subsequent chapters will be deeply critical of Descartes' 'mathematisation' of the natural world, claiming that rather than guaranteeing certain knowledge of the world, the Cartesian strategy has only guaranteed certainty within the context of some idealised world of abstractions42. Consequently, it is argued that certainty (or truth) is an unrealistic goal of science and therefore the nature of scientific knowledge (commonly thought of as a 'list of truths') will need to be rethought.

Discourse on Method has been described by Butterfield (1949) as "one of the most important books in our intellectual history". The principle of analytic reduction found within it has gone from being a simple rule of thumb to becoming one of the most significant ideologies of the modern scientist. Furthermore, Descartes' insistence on the use of mathematics, in all its precision, to characterise the apparently chaotic world of sensory perceptions has permeated almost every scientific discipline. Following Descartes, mathematics has been used by rationalists and empiricists alike as either a guarantor of certainty (Descartes and the rationalists) or a method of generalisation and, therefore, guarantor of objectivity (Newton and the empiricists).

Accompanying these intense discussions on epistemology and methodology were advancements in scientific knowledge itself. Central to these advancements (known as the scientific revolution) was a change in cosmology. The first steps can be seen with Copernicus' claim that the earth was not the centre of the universe, which was followed closely by Kepler's calculations of planetary orbits. Kepler's laws of planetary motion (which accounted for de Brahe's observations of the heavenly bodies in generalisations of hitherto unimagined simplicity) also represented one of the first uses of mathematics to generalise observed phenomena. The high-water mark of this revolution, however, came with the work of Isaac Newton (1642-1727). If Descartes is considered the father of Enlightenment philosophy, then Isaac Newton must be the father of Enlightenment science.

41 Thus Descartes is often referred to as a rationalist, claiming that reason and not observation is the source of all knowledge. As has already been discussed, the rationalist / empiricist divide had been a long-standing one, finding its origins in the classical philosophers of the Greek period.
42 Bertrand Russell perhaps summarises this line of thought best when he says: "mathematics may be defined as the subject in which we never know what we are talking about, or whether what we are saying is true."

Newton's universe was a grand orderly machine whose movements could be predetermined by fixed observable laws. The primary vehicle that promulgated Newton's (1687) cosmology was his Mathematical Principles of Natural Philosophy, known ever after as the Principia. The Principia effectively overthrew the (largely Aristotelian) pre-Enlightenment view of the natural world. Indeed, many of the central tenets of modern science first appeared in the Principia. Of these, perhaps the centre-piece was Newton's characterisation of the three fundamental laws of motion:

N1. Every body continues in its state of rest, or uniform motion in a straight line, unless it is compelled to change that state by forces impressed upon it.
N2. The change of motion is proportional to the motive force impressed; and is made in the direction of the line in which that force is impressed.
N3. To every action there is always opposed an equal reaction; or, the mutual actions of two bodies upon each other are always equal, and directed to contrary parts.

The reigning Aristotelian worldview held that all earthbound objects ultimately acquired a state of rest unless acted upon by some motive force. Thus a force was required to keep things moving, even at a constant speed, rather than to cause something moving at a constant speed to stop. Newton's first law, the so-called law of inertia, was in striking contradiction to something taken as self-evident for the past 2000 years43.
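As an editorial aside (the Principia itself states the laws in the prose and geometric idiom quoted above, not in this notation), the three laws can be restated in the now-standard vector form, which makes plain that the first law is simply the zero-force special case of the second:

```latex
% Modern vector restatement of Newton's three laws.
% This notation post-dates the Principia; it is offered only as a gloss.
\begin{align*}
\text{N1:}\quad & \mathbf{F}_{\mathrm{net}} = \mathbf{0} \;\Longrightarrow\; \mathbf{v} = \text{constant} \\
\text{N2:}\quad & \mathbf{F} = \frac{\mathrm{d}\mathbf{p}}{\mathrm{d}t} = m\mathbf{a}, \qquad \mathbf{p} = m\mathbf{v} \ \text{(constant mass } m\text{)} \\
\text{N3:}\quad & \mathbf{F}_{AB} = -\mathbf{F}_{BA}
\end{align*}
```

Seen this way, the break with Aristotle is stark: uniform motion requires no sustaining force at all; only a change of motion does.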

Notwithstanding Newton's disciplinary contributions in physics and mathematics, perhaps the most far-reaching legacy of Newtonian science was the profound shift he inaugurated in the way of approaching science. As we have seen, medieval science was dominated by the Aristotelian need for observation. However, following Descartes scepticism toward the objects of sensory perception, the rationalists of the early Enlightenment period chose to retreat into

€ lt is wodh not¡ng that the law of inertia was actually formulated by Bene Descades without any experimentation through his method of 'pure reason'. Similarly, Galileo demonstrated a clear understanding of the lirst law when he noted that an object sliding along a level sudace did indeed come to rest, but that when pushed with similar force along a more slippery surface it would travel further before stopping. From this observation Galileo imagined an intinitely slippery surface (i.e. one without friction) and argued that if an object were to be given a push along such an imaginary surface it would never come to rest. Thus, contrary to the Aristotelian physics of his time, Galileo argued that a force was required to stop uniform motion in a straight line. lndeed, Galileo also demonstrated some understanding of Newton's second law by timing the fall of similarly shaped objects of different weights. Galileo noted that, contrary to Aristotle's physics, all of the objects fell at more or less the same rate (around 1Oms-'). Again Galileo used a similar thought experiment to that of his infinitely slippery sudace, this time imagining an infinitely slippery space (the famous point mass in a vacuum argument). By eliminating air resistance, Galileo argued that all objects regardless of weight or shape fall with uniform acceleration.

the certainty of mathematics, and argued that reason, not observation, was the source of all knowledge. Newton on the other hand was very much an empiricist, insisting along with the Aristotelians that empirical 'observation', rather than rational 'speculation', was the source of all knowledge. Where he differed with the Aristotelians was in the method of generalising observations. Whereas the Aristotelians typically emphasised observing the telos (purpose) of objects in a qualitative manner, Newton pioneered the use of mathematics to generate 'laws of nature', being generalisations from specific empirical observation. Thus Newton's science embodied the synthesis of reason and observation that Bacon had argued for. However, in turning to this view of the scientific enterprise, Newton and his contemporaries narrowed their focus of interest and began to treat as real only those aspects of the universe that were measurable. Following Newton, Enlightenment thinkers began to apply this new ideology to all areas of inquiry. Not only the natural sciences, but the humanities, politics, art, philosophy and even theology emerged as candidates for quantification and measurement.

One of the dominant principles behind the Enlightenment's twin revolutions in philosophy and science was the elevation of reason over superstition44. In this sense, the era is also known as 'The Age of Reason'. Closely related to this principle is the idea of method. Enlightenment thinkers believed that the truths of the universe could be accessed by the application of the one true and everlasting method45. Furthermore, this method could be applied to any object of inquiry and yield 'observable' laws. Indeed, due to the Newtonian belief that the universe was an orderly machine, the search for some governing law (or equation) came to characterise the entire scientific enterprise. Perceiving themselves as the vanguard of a revolution from scientific error to scientific truth, the Enlightenment thinkers implicitly believed in progress. Bacon, Descartes, Newton and others believed that the discovery and application of the indubitable laws of nature would have practical significance. If humans could discover nature's secrets, then utopia would dawn. The conjunction of each of these ideas made for a period of immense optimism, as expressed by Isaiah Berlin (1956):

"The eighteenth century is perhaps the last period in the history of Western Europe when human omniscience was thought to be an attainable goal."

The Enlightenment ushered into existence what cultural theorists term the modern worldview. The modern world has been variously described. However, in the context of the preceding discussion, it may be summarised as Newton's mechanistic universe populated by Descartes' autonomous, rational subject, who is armed with Bacon's scientific method, and hence capable of discovering nature's truths and ushering in an age of progress.

44 As Kant (1784) was to say: "Enlightenment is man's release from his self-incurred tutelage". As such, it represented man's ability to "make use of his reason without direction from another".
45 Enlightenment thinkers did not always agree on the precise characterisation of this method, as evidenced by the split between rationalists and empiricists. However, both agreed that, when followed, proper method should yield the laws of nature.

From Francis Bacon to the mid twentieth century, the goal of the modern intellectual has been to unlock the secrets of the universe in order to master nature for human benefit46. This worldview produced the modern industrial society and the associated modern industrial economies that followed the Enlightenment. Indeed, it is only recently that we are beginning to see both of these supplanted by a new type of world, namely a postmodern world, characterised by a move away from industrialisation (with its focus on mechanistic knowledge) and towards information-based economies (with their focus on 'meaning' and 'sense making').

1.2.4 The Decline of the Modern Ep¡steme: Kant and the Critique of Subject-Object Dualism

"l therefore found it necessary to deny knowledge, in order to make room for faith."

- Immanuel Kant

Notwithstanding the almost universal belief that scientific method could, should and would generate 'scientific' laws, a number of methodological disputes erupted during the Enlightenment. Chief of these was the dispute between the rationalists and the empiricists. The nature of this dispute is perhaps best captured in the Leibnitz-Clarke Correspondence, published in 1717 (Clarke & Leibnitz, 1717). The real protagonists in this wide-ranging dispute were Leibnitz and Newton, for whom Samuel Clarke acted as spokesperson (Gardner, 1999). Each advocated a different method for arriving at scientific theories about nature. For his part, Leibnitz championed the rationalist school prevalent in continental Europe, which advanced the idea that the foundations of all knowledge were the innate ideas of the mind, that is, reason. As such he employed a deductive method of inquiry derived from Descartes, which began with abstract mathematical notions and worked down to concrete nature. Newton, in contrast, championed the empiricist school prevalent in Britain, which advanced the idea that the foundations of all knowledge were the perceptions of the senses, that is, observation. As such he advocated an inductive method derived from Bacon, which began with quantitative measurements and worked upwards to 'first principles' (explained mathematically). The high point in the controversy came when Leibnitz's reasoning arrived at conclusions about the structure of reality diametrically opposed to those that Newton arrived at by observation. This perceived paradox was of sufficient concern for a small group of sceptical philosophers to begin to denounce some of the presuppositions of the 'Age of Reason'.

46 This is the emancipatory thrust of modernity that Habermas (1968a) discusses and attempts to reinvent through 'critical theory'.

One of the first to articulate the growing scepticism was David Hume (1711-1776). In 1741 Hume published his Treatise of Human Nature. In it he argued that the reliance on the method of induction, which had characterised the empiricist school of the Age of Reason, was indeed, unreasonable. Moreover, Hume (1741; 1748) was sceptical about any attempt to associate reason with nature (as the rationalists attempted to do), claiming that: "our beliefs about the external world have no foundation in reason and repose entirely on habit or custom or experience". Hume's contribution is discussed at length in Section 2.3.3, so for now it is sufficient to note that Hume's scepticism was not well received at the time of its publication. It cast serious doubt on some of the most cherished beliefs of the Enlightenment and was criticised for not offering a viable alternative. Despite the frosty reception it received, Hume's writings did not go unheeded for long and were credited with awakening the creative genius of a man who completely reformulated the modern worldview47. This man was Immanuel Kant (1724-1804). Immanuel Kant's life was spectacularly uneventful. He was born, studied, taught and died in the same place: the East Prussian port of Konigsberg (now Kaliningrad, Russia). He never married or travelled and it was not until he was 57 that he published his first book Critique of Pure Reason (Kant, 1781). Yet the ideas embedded within it initiated the beginning of the end for modernity and created an intellectual tidal wave, the effects of which are still being felt (Grenz, 1996)48. In his Critique of Pure Reason Kant sought to "institute a tribunal which would assure to reason its lawful claims, and dismiss all groundless pretensions". To this end he drew three distinctions: a priori vs. a posteriori, necessary vs. contingent, and analytic vs. synthetic.
According to Kant a proposition is knowable a priori if it is knowable without relying on experience (observation), otherwise it is a posteriori. Furthermore, a proposition is necessary if it is true, not just in the actual world, but in any possible world. Otherwise it is contingent. Thus a proposition is necessary only if it is knowable a priori, and contingent only if it is knowable a posteriori. The third distinction Kant proposed related to the position of the predicate. An analytic judgement is one in which the predicate belongs to the definition of the subject concept (or can be derived from the subject concept using only definitions and logical laws). Thus the denial of an analytic judgement is a contradiction and, therefore, all analytic judgements are justified by this principle (the principle of contradiction). An example of an analytic judgement is the claim that 'a triangle has three sides'. Clearly the concept of three-sidedness is contained in the definition of triangularity. It is for this reason that Kant argues that analytic judgements never extend knowledge but merely explicate concepts. On the other

47 Kant credited a "recollection of Hume" with "interrupting his dogmatic slumber". Indeed, in his Prolegomena to any Future Metaphysics, Kant claims that Hume was the first philosopher ever to identify the serious difficulties facing metaphysics: "since the origin of metaphysics so far as we know its history nothing has ever happened which could have been more decisive to its fate than the attack made upon it by David Hume".
48 Indeed, Scruton's (1995) A Short History of Modern Philosophy describes the Critique as "a work of an intellectual depth and grandeur that defies description".

hand, a synthetic judgement is one in which the predicate lies outside the subject concept. If the predicate is not contained in the subject the judgement must be justified by something other than the principle of contradiction. Kant claims that such judgements rest on a synthesis49. That is, a bringing together of elements not previously joined. For example, the judgement that 'all bodies are heavy' is synthetic because the concept of 'weight' is not contained in or implied by the concept 'body'.

According to Kant's precursors (notably the empiricist, Hume, and the rationalist, Leibnitz) all necessary a priori judgements were necessarily analytic. That is, a priori knowledge was only possible for analytic propositions and, as such, most of the traditional propositions of metaphysics (e.g. the proposition that 'every event has a cause') should be dismissed. However, Kant was convinced that there were important classes of propositions that were both a priori and synthetic and, therefore, refused to dismiss metaphysics.

As part of his defence of metaphysics, Kant invoked the 'good company argument', including metaphysical propositions in the same class of propositions as mathematical ones. That mathematical judgements are necessary and a priori is obvious, but that they are synthetic is not. Leibnitz supposed that arithmetic propositions, such as 27+12=39, are analytic. That is, they are justified in the same way that the proposition 'a triangle has three sides' is justified: by the principle of contradiction. However, Kant argued that the concept SUM[27,12] does not contain the concept of the number 39. Synthesis is required to make the connection between the concept SUM[27,12] and the predicate 39.

Notwithstanding the insight of Kant's good company argument, the central question of "how could metaphysical propositions be possible?" remained unanswered. That is, if metaphysical

49 The distinction between analytic and synthetic propositions has been much debated. The definitions presented here are Kant's reinterpretation of the Lockean distinction. Kant reiterates Locke's account of concept-containment but introduces the notion that analytic propositions are propositions whose denial is contradictory. This characterisation suggests that analytic propositions are also a species of logic; involving some form of logical-containment as well. Following Kant's introduction of logical-containment to the Lockean idea of concept-containment most analytic philosophers have attempted to associate analyticity solely with logic and reject the Lockean aspects. Frege (1884) argued that concept-containment was defective in that it did not cover all cases of analytic propositions and that it explained analyticity in unacceptably psychological (rather than logical) terms. Thus he defined analytic propositions as consequences of logical laws and definitions, and removed the idea of concept-containment completely. Following Frege, Carnap (1947) took the remaining step of associating analyticity entirely with logic by removing Frege's reference to definitions (Dancy & Sosa, 1992). Carnap's conception is the analyticity of the Aufbau and the logical positivists, who attempted to deflate Kant's claim that synthetic a priori knowledge is possible and argued that alleged synthetic a priori truths were merely empty analytic truths. However, the story by no means stops with the positivists. In the later twentieth century there has been somewhat of a reversal of the logicism of Frege and Carnap. First, Quine (1953a,b) criticised the Carnapian concept of analyticity as both irrelevant and vacuous and argued that the Carnapian conception cast doubt on the entire analytic/synthetic distinction in the first place.
Second, Putnam (1975a; 1981) provided a devastating critique of the entire program of construing analyticity as a logical concept by providing counter examples to the central tenets of the Frege-Carnap thesis. The success of Quine and Putnam's criticisms has seen a general abandonment of the attempt to associate analyticity with logic in recent times. Thus, analyticity is once again a hot topic and there are several attempts currently underway to redefine the concept in more Lockean-Kantian terms. Notwithstanding the current state of flux, this work will persist in describing analyticity (and therefore its anti-thesis, syntheticity) in largely Kantian terms. The justification for persisting in this manner is two-fold. First, the state-of-the-art in analytic philosophy is also re-discovering Kant and Locke. Second, and perhaps most importantly, this is not a thesis on analyticity and therefore the arguments that follow are not influenced one way or the other by its definition. As we shall see, the analytic/synthetic distinction is introduced to describe the principal problem of Kant's Critique of Pure Reason. That is, how are synthetic a priori propositions possible? It is Kant's

propositions are possible in the same manner that mathematical ones are, then the question remains "how are all mathematical propositions possible"? Or, more tellingly, "how can metaphysics (including mathematics) apply to the natural world"? Accordingly, Kant made this the centre-piece of his Critique. In answering this question Kant proposed a bold hypothesis: the mind is active in the knowing process. Or, in Kant's (1781) words:

"lf intuition must conform to the constitution of the objects, I do not see how we could know anything a priori; but if the object must conform to our faculty of intuition, I have no difficulty in conceiving such a possibility".

Kant described this change of paradigm as the 'Copernican revolution in philosophy'. Just as Copernicus explained the 'apparent' motion of the sun in terms of the movements of the observer on earth, Kant explained our knowledge of 'apparently' independent objects in terms of our mode of cognition. Accordingly, many of the observed features of objects are explained by reference to traits of the observer rather than traits of the objects 'in themselves'.

According to Kant, the mind has certain a priori intuitions, which are implicated in every observation of reality (the a priori forms of pure intuition). Thus, we do not derive knowledge from sense experience alone (as the empiricists would have). The senses merely provide 'raw data' which the mind subsequently systematises. In particular, Kant argued that 'space' and 'time' do not inhere in 'things in themselves', but are a part of the ordering that the mind imposes on the world it encounters. That is, space and time are pre-supposed for the experience of objects rather than derived from them. This position, which Kant called the transcendental aesthetic (but which quickly became known as spatiotemporal construction), is in stark contrast to the position of the Enlightenment scientists who argued that space and time were 'real' existences. According to Kant, space, time and objects with spatiotemporal properties are transcendentally ideal. That is, they are not pure objects of the world, but arise from our sensibly constrained mode of apprehension of it (Allison, 1986).

As well as implicating the mind in every observation of reality, Kant implicated it in every description and interpretation of reality. According to Kant, the mind possesses certain a priori concepts (the a priori concepts of understanding), which are active in the knowing process even after we have experienced objects sensibly. Thus, our understanding is not derived from pure reason alone (as the rationalists would have) but from reasoning that makes use of the mind's a priori concepts of understanding. This position Kant referred to as the transcendental analytic. Moreover, it led him to distinguish between objects understood via the a priori concepts of the knower (which he termed phenomena) and the real, unconstructed object-in-itself (which he termed noumena), leading to the well-known two-worlds interpretation of Kant.

'Copernican shift' that occurs whilst attempting to answer this question that is important for our purposes and not the

Despite the simplistic explanatory appeal of the two-worlds interpretation of Kant's Transcendental Analytic, it is argued here that it completely misses the point. Specifically, that noumena are not 'in a different world' to phenomena. One way to understand this subtle distinction is by invoking the 'cookie-cutter' metaphor:

The dough (thing in itself, noumena, etc) is independent of the cook (us). However, the cook imposes cookie cutters (a priori concepts) on the dough in order to create cookies (appearances, phenomena, etc).

Thus, there is one world. But this world must always be taken in a two-fold sense as being composed of appearances (Erscheinung) and things in themselves (Ding an sich selbst). The nature of Kant's 'Copernican strategy' is illustrated in Figure 1 below.

According to Kant, all knowledge of objects (both sensible and rational) has both a subject-constructed component and an object-constructed component. Furthermore, we can never separate out the subject-constructed component and thence subtract it from our knowledge of the object in order to be left with pure knowledge of 'things-in-themselves'. One of the implications of Kant's 'Copernican shift' is that the object / subject distinction previously held to be so important becomes obsolete. In one fell swoop, Kant dissolved the radical subjectivity of the Cartesian rationalists as well as the naive objectivity of the Newtonian empiricists. Against the rationalists he argued that all knowledge of objects is observed and against the empiricists he argued that no observation of an object could reveal its intrinsic (or mind-independent) properties. The intrinsic properties will necessarily be unknowable and hence science can only reveal knowledge of things as they are to us, and never knowledge of things as they are in themselves. Transcendental idealism, therefore, places strict limits on scientific knowledge, summarised by Kant as:

"Wir sehen das Innere der Dinge gar nicht ein." (We have no insight whatsoever into the intrinsic nature of things.)

Moreover, Kant introduces an important variation of the Cartesian 'self'. Rather than viewing the 'thinking self' as one of several 'thinking things' within the world, Kant envisioned the 'thinking self' as implicated in creating the world. That is, the world of its own knowledge.

details of how he arrived at this particular problematic

Rationalist Accounts of Knowledge

(Subjective)

The subject, S, impresses itself upon the object, O, in an impartial manner, using 'reason'. If O conforms to S's reasoning then S 'knows' O (deductively) and S's representation of O will lie ultimately in O 'being that way'. Had O been otherwise, S would have represented O differently.

Empiricist Accounts of Knowledge

(Objective)

The object, O, impresses itself upon the senses of subject, S, in a transcendentally isomorphic manner. If S can 'observe' O's behaviour repeatedly, then S will 'know' O (inductively) and S's representation of O will lie ultimately in O 'being that way'. Had O been otherwise, S would have represented O differently.

The Kantian Account of Knowledge

(Transcendental idealism)

Subject, S, brings to the observation of O certain a priori concepts, which inhibit O impressing itself upon S in a transcendentally isomorphic manner. Thus, S's representation of O (and indeed what is thought to be O itself) is in some way subject constructed.

Figure 1: The Nature of Kant's 'Copernican Shift'

Kant's Critique has often been described as a final articulation of modernity, due to his presumption that, in important matters, all 'thinking selves' are essentially the same (i.e. we all possess the same a priori forms of pure intuition and the same a priori concepts of understanding). This assumption is known as the doctrine of transcendental unity of apperception and allows that all thinking selves create the same world. Thus the modern ideal of absolute knowledge may be reformulated into a pseudo-modern public knowledge50. However, it may be more useful to view Kant's Critique as a bridge between the modern and the postmodern. There can be no argument that the hypothesis that the a priori concepts in the mind are implicated in both the observation process and the knowing process irreversibly changed modern philosophy and, in particular, the modern illusion of objectivity51. The only remaining step between Kant and the postmoderns was to acknowledge the diversity of 'thinking selves' and hence the diversity of 'created worlds'52.

1.2.5 The Fall of the Modern Episteme: Nietzsche and the Rise of Perspectivism

"I know my fate. One day my name will be associated with the memory of something tremendous - a crisis without equal on earth, the most profound collision of conscience, a decision that was conjured up against everything that had been believed, demanded, hallowed so far. I am no man. I am dynamite!"

- Friedrich Nietzsche

The distinction of firing the first volley against the doctrine of the transcendental unity of apperception fell to Georg Hegel (1770-1831). Hegel (1807) was deeply influenced by Kant's critique of reason and his analysis of subject construction. However, he drew a conclusion from Kant's work that Kant was unable (or unwilling) to. Whereas Kant claimed that the a priori categories of interpretation were transcendental, Hegel claimed that they were
50 The modern interpretation of Kant claims that given the transcendental unity of apperception (i.e. given that the a priori structure of the mind is uniform), knowledge becomes a strictly a posteriori affair and the pre-Kantian ideal of transcendental objectivity is reformulated into some form of consensus model of objectivity. Consensus models have formed the basis of such pseudo-modern thinkers as Charles Peirce (1932) and Jurgen Habermas (1981a,b; 1985).
51 Indeed, the Kantian hypothesis completely denies one of the central tenets of modernity. That is the Cartesian assumption of subject-object dualism.
52 This overview of Kant deliberately focuses on the 'effects' of Kantian thought within the historical genealogy presented here and does not seek to present the systematic structure of the Kantian corpus. However, for a deeper understanding of Kant the reader is encouraged to come to terms with the entire structure of his thought. For a (slightly positivistic) discussion of Kant the reader is directed to Gardner (1999). However, if the slight positivism of Gardner is unacceptable, then the reader is directed to Kant's (1783) very readable Prolegomena to Any Future Metaphysics. This thesis attempts a somewhat different interpretation to that of Gardner's in the Section on Critical Systems Heuristics (3.3.3). The main distinction between the position taken here and Gardner's (1999) is in the interpretation of Kant's Transcendental Dialectic - a Kantian position not mentioned in the overview presented above.
Whereas in the Transcendental Analytic, Kant argues against Hume by supporting Leibniz's claim that a priori reason is necessary for knowledge, in the Transcendental Dialectic, Kant turns against Leibniz and supports Hume's claim that objects must be experienced in order to be known. The point of the Dialectic according to positivistically inclined commentators is to show that only objects of experience can be known and, therefore, metaphysical speculation is meaningless. However, the interpretation offered here (Section 2.4.3) suggests that the point of the Dialectic is to explain our propensity towards metaphysics and, by setting up unknowable ideals, help us avoid objectivist illusions.

historical. According to Hegel, the limits to reason, which Kant's Copernican shift uncovered, were furthered by the fact that the a priori concepts were embedded in a historical context (something akin to Pepper's root metaphor process). Thus Hegel postulated an historically contingent element to reason, knowledge, and truth and by so doing steered post-Kantian discourse towards postmodernism. Hegel's claims to historicity actually flow quite naturally from Kant's work. From Kant, Hegel learned that the subject is implicated in the construction of knowledge of objects. However, he added to this the insight that subjects are historical creatures, incapable of fully transcending their historical location. Thus, Hegel brought Kant's transcendental idealism together with the ideas of historical relativity that were being promoted by the 'Romantic Movement' of Herder and Humboldt during his time. Following Hegel, the subject was situated53.

By situating the subject in an historical context, Hegel's key problem changed from 'how is knowledge possible?' to 'how can we move from a partial (historical) perspective to complete (absolute) knowledge?' However, despite the change in emphasis, Hegel's problematic remained the traditional (modern) one of finding a path to truth54. His solution, on the other hand, was anything but traditional. According to Hegel, the answer is history itself. That is, the dialectical progression of social revolution, philosophical reflection and contingent knowledge-claims would end in the highest level of consciousness, the attainment of absolute knowledge. Thus, history told the story of the eventual liberation of man from his subjectivity. In this, Hegel was deeply influenced by aspects of Indian and Oriental philosophy, and defined absolute knowledge as "an all inclusive synthesis of the whole".

Perhaps the first to despair of Hegel's belief in history providing a path to truth (certainly within recent European thought), was Friedrich Nietzsche (1844-1900). In Nietzsche, one gets Hegelian historicity and perspectivism without the need to propose a solution to the traditional modern problematic. According to Nietzsche, Hegel attempted to find in history the kind of certainty that Plato hoped to find in mathematics, Descartes hoped to find in subjective (rationalist) foundationalism and the positivists of his time were beginning to look for in a unified empirical science. Nietzsche, on the other hand, rejected the quest for certainty and suggested that a re-orientation in Occidental conceptions of truth, knowledge, science and progress (among other things) was necessary.

Nietzsche was trained as a classical philologist and appointed as professor of philology at Basel University at the astonishingly early age of 24. His introduction to philosophy came

53 For a more thorough study of Hegel's position in relation to Kant's the reader is directed to Hegel's critique of Kant in Phenomenology of Spirit (Hegel, 1807). A more recent explication of the situated subject is advanced by the anthropologist Benjamin Whorf (1956). According to Whorf, human cognitive processes do not possess a common logical structure that operates prior to and independently of language. Rather, linguistic patterns themselves determine what the individual perceives in the world and how s/he thinks about it. Moreover, since these patterns vary between different groups, then different modes of thinking and perceiving in each group will result in largely different worldviews.
54 Hegel simply rejected the traditional account that 'objectivity' conjoined with 'method' provided absolute knowledge.

through his discovery of the works of Schopenhauer (1819; 1844), following which he embarked upon a remarkably productive period of chiefly philosophical work. Indeed, in the 17 years from 1872 to 1889, Nietzsche published over 20 books and countless essays. Unfortunately, almost immediately after the publication of Twilight of the Idols, in January 1889, Nietzsche collapsed whilst on holiday in Turin and entered into a period of complete physical and mental decline (probably of syphilitic origin) until his eventual death in 1900 at the age of 55 (Schacht, 1995).

Notwithstanding Nietzsche's prolific rate of publication, making sense of him is a tricky business. Indeed, over the years many have made a kind of sense of him that borders on the horrendous (Schacht, 1995). For example, during the 1930s and 1940s Nietzsche seemed to be the property of Nazi intellectuals, who claimed that Nietzsche's idea of the Ubermensch signalled a call to arms for the Teutonic race. Even after the war it was common to think of Nietzsche as somehow related to both German militarisation and the Jewish holocaust. However, such a characterisation was a complete travesty and Walter Kauffman's (1950) critique of the 'Nazi Nietzsche' went a long way to righting this wrong.

Kauffman (1950) argued that Nietzsche was not the proto-Nazi bogie man that he was often seen to be, but the apostle of Hegel and Kierkegaard and the main nineteenth century advocate of existentialism. However, whereas Kierkegaard's existentialism was ultimately grounded in his Christian faith, Nietzsche's version was thoroughly secular. According to Kauffman, Nietzsche demonstrated that existentialism was independent of religion; it could be consistent with both Kierkegaard's fervent defence of Christianity and Nietzsche's fervent criticism of it. Following Kauffman, the ghosts of the proto-Nazi bogie man were effectively exorcised. Whilst his image of the existentialist Nietzsche did not stand the test of time, his exorcism of the 'Nazi Nietzsche' opened the door to a new interest in Nietzsche's writings and a swathe of new interpretations. At the end of the day, Kauffman made Nietzsche respectable. It took an American Jew whose family was victimised in the war to put to death old wartime caricatures and re-introduce Nietzsche to analytic, continental and pragmatic philosophers alike.

Of the more recent, post-Kauffman, interpretations of Nietzsche, two stand out as remarkable and deserve a special mention. First, Danto (1965) in his book Nietzsche as Philosopher shocked the entire philosophical world by making the claim that Nietzsche was actually a precursor to analytic philosophy. According to Danto (1965), Nietzsche made important contributions to traditional (modern) philosophical problems such as the nature of truth, knowledge, science and progress. Moreover, these contributions can be seen as a precursor of the 'linguistic turn' made by post-war analytic philosophers, who despaired of finding a purely epistemic theory of knowledge and therefore attempted to ground knowledge in a theory of language. The attempt to cast Nietzsche in such a role was bold but ultimately

doomed to failure. Only a handful of analytic philosophers welcomed Nietzsche to the fray and most continued to reject him as the enemy. The 'proto-Nazi' enemy of democracy had simply been replaced by the 'postmodern' enemy of truth and progress.

Whilst most contemporary analytic philosophers continued to characterise Nietzsche as the enemy, a new wave of Nietzsche followers found his denial of truth, knowledge and scientific progress as heralding a new era of intellectual thought. These are the so-called 'New Nietzscheans', 'post-Nietzscheans', or more generally, 'postmodern' theorists (Allison, 1977; Behler, 1988; Behler & Taueneck, 1991; Clark, 1990; Deleuze, 1983; Heidegger, 1979, 1984, 1987, 1992; Schacht, 1995). According to the postmodern interpretation, Nietzsche did not contribute substantially to traditional philosophical problems, but attempted to dismiss the entire enterprise, and, by doing so, proclaim the coming of a new age55. Thus, the analytic detractors are right to consider Nietzsche as revolutionary, but wrong to dismiss him because of this. Nietzsche, as the postmoderns understand him, shows the futility of any attempt to ground knowledge (including the analytic attempt to do so in language). Accordingly, Nietzsche is seen as the enemy of foundationalism, whether it be the Cartesian 'thinking self', the Kantian 'transcendental unity of apperception' or the Analytic 'linguistic turn'56.

Following Kant's critique of subject-object dualism, it had been the goal of modern philosophy to propose a 'theory of knowing', or epistemology, capable of overcoming the problems associated with subjectivity and legitimating the modern scientific claim of 'progress toward truth'57. Nietzsche (1879; 1885; 1886; 1887a; 1889) is highly critical of this philosophical project and ultimately of Kant himself. According to Nietzsche, Kant's 'Copernican shift' was not the revolution it claimed to be, but a brilliant, desperate and ultimately failed attempt to save the Enlightenment ideal of a transcendent measuring rod (a.k.a. Kant's transcendental unity of apperception). In contrast, Nietzsche's writings continually emphasised the 'loss of the transcendent' in all of its incarnations. Nietzsche (1885; 1889) sees the loss of the transcendent as a fundamental shift in the history of Occidental thought and captures this through the metaphorical proclamation: "the death of God". According to Nietzsche, the West was being inexorably drawn to the realisation that the ideals, rules, norms, principles etc. that had previously been seen as foundational or transcendent were becoming increasingly unbelievable. Accordingly, Nietzsche argued that these principles could no longer give other aspects of human activity meaning and legitimation. Furthermore, Nietzsche (1879; 1885;

55 The New Nietzsche interpretation has been largely associated with continental philosophers such as Foucault and Derrida. However, contemporary pragmatists, such as Richard Rorty (1991b), agree with this interpretation, arguing that pragmatism could equally (and perhaps more appropriately) be labelled post-Nietzschean. 56 Foundationalism is an epistemic meta-theory concerning the structure of knowledge. An epistemic theory can be said to be foundationalist if it is structured hierarchically such that knowledge of higher levels is dependent on lower levels and ultimately on some fixed foundation. Foundationalism attempts to fix points of reference that may be used as a foundation upon which truths may be certified. The classical foundationalist position in regard to knowledge is that advanced by Rene Descartes, who argued that certainty can be assured from the indubitable foundations of the 'thinking self' and the application of 'scientific method'. 57 We have already discussed Hegel's epistemology in this respect.

1886; 1889) argued that to simply replace these principles would be a mistake and that what was called for was nothing less than the complete overthrow of the dominance of the transcendent58. That is, a revolution in philosophy that would see it transcend the transcendental categories (pardon the pun) of truth and falsity, science and non-science, good and evil. These are the idols whose twilight was at hand in the coming revolution (Nietzsche, 1889). Such a revolution would be a "prelude to a philosophy of the future".

According to Nietzsche, once the dominance of the transcendent is overthrown and there is a general realisation that 'all truths are illusions', a period of nihilism will necessarily follow. Thus, Nietzsche claimed: "nihilism is the necessary consequence of our valuations thus far". But what did Nietzsche mean when he used the word 'nihilism'? The answer can be found in Beyond Good and Evil, where Nietzsche (1886) claimed that nihilism is what ensues when "the uppermost values devaluate themselves". In Nietzsche's mind, the uppermost value of the modern worldview was reason itself. And it was reason that was in the process of devaluating itself from within. Nietzsche argued that when reason was asked to critique itself it could never justify the Enlightenment presumption that it had the power to disclose such transcendental properties as 'The Good' or 'The True'. Furthermore, Nietzsche prophesied that this failure would ultimately lead to a realisation that science would never be able to yield anything like absolute knowledge. Faith in reason's ability to yield anything transcendent was entirely misplaced. Indeed, the very idea of a transcendent 'real world' radically devalues everything that has no grounding in the absolutes it postulates. Thus, the end point of the age of reason was destined to be nihilism59.

This conclusion has led many to assume that Nietzsche was a nihilist. However, such an assumption may be premature. Certainly, Nietzsche believed that nihilism was nigh. However, when faced with the advent of nihilism, rather than herald its triumph, Nietzsche's chief concern was with what would eventually replace the dominance of the transcendent. He believed that if the dominance of the transcendent were to be overthrown then we would be saved from the inevitability of nihilism. Thus, Nietzsche (1879) was obsessed with what we could become if we were able to discard the "human, all too human" reliance on transcendental legitimation that had characterised occidental history for some 3000 years. All of his talk of "becoming what we are", of a "higher humanity" or of "the death of God and the arrival of the superman" was aimed at defeating nihilism by transcending the transcendent (Nietzsche, 1968)60. What Nietzsche proposed to put in place of the idols (the transcendental categories)

58 Nietzsche called this position 'inverted Platonism', where 'Platonism' is interpreted broadly as the position that there is a transcendent or super-sensory world that provides meaning to the immanent or sensory one experienced on a daily basis. 59 Such a conclusion is reminiscent of Hume's belief that, if left to its own devices, reason would always result in utter scepticism. 60 The idea of the 'Ubermensch', directly translated 'overman' (and often 'superman'), is one of Nietzsche's constructive doctrines that were appropriated by Nazi intellectuals to justify the rise of National Socialism in Germany in the 1930s and 1940s.

is something similar to what this thesis calls contextualism. Nietzsche termed it perspectivism. In Nietzsche's view, the world is a chaotic flux that is full of difference. However, in constructing general concepts, we overlook the fact that no two things (or occurrences) are exactly the same. Consequently, our conceptualising robs reality of its multiplicity (Grenz, 1996). According to Nietzsche, 'knowing' is not a transcendental match between objective reality and subjective representation, but a process of imposing enough order and regularity on the chaos of the world as practical needs require61. Thus, knowing is schematising, or in Nietzsche's words, knowing "involves a great and thorough corruption, falsification, reduction to superficialities and generalisation". In The Genealogy of Morals, Nietzsche (1887b) clarified his position by invoking the visual metaphor of a 'perspective', which he claimed highlights something profound about the nature of knowledge. Nietzsche argued that the idea that it is possible to know something from no perspective at all was "an absurdity and a nonsense". For Nietzsche, "thought is never external to being", indeed, "our thought is made of the same substratum as everything else". That is, it is integrated with 'reality' and is neither the cause nor the measure of it. Therefore, no perspective can exhaust the richness of reality. Accordingly, Nietzsche argued for the perspectival nature of knowledge, claiming that it is impossible to justify our beliefs by reference to some foundational truth62. Indeed, all justification is contextual and dependent upon other beliefs that are held unchallenged for the moment, but themselves only capable of a similar contextual justification. Knowledge is therefore a perspective, an interpretation.
Or, in Rorty's (1991b) words: "Knowledge-in-itself is therefore as impermissible a concept as a thing-in-itself and the categories of reason represent nothing more than the expediency of a certain civilisation (the Occident): their utility alone is their truth".

Nietzsche's claim that all knowledge is a perspective, and that a perspective is simply an interpretation, denies the Kantian assumption of the uniformity of the transcendental categories of understanding. An interpretation implies some form of personal creative contribution to understanding the world. Furthermore, by introducing the notion of interpretation, Nietzsche associates 'the world' with 'text'. According to Nietzsche, "the world is a text that needs our exegesis". This task is complicated by the fact that the text is obscure and often full of gaps, by the fact that several 'readings' seem plausible and by the fact that we ourselves are also 'characters' in the text we wish to interpret. Thus, rejecting the dominance of the transcendent means accepting radical corrigibility: we must always remember that our knowledge is uncertain and its justification is only ever relative to other uncertain beliefs. Or in Nietzsche's (1873) words: "facts do not exist, only interpretations".

61 As an example, Nietzsche considers the relationship of our concept 'leaf' to real leaves (Nietzsche, 1976). Although all leaves may share certain characteristics, each leaf differs from every other leaf. We can form the concept 'leaf' only by overlooking the difference. Nietzsche held that the concept 'leaf' is thus a falsification of the reality of leaves. The term 'leaf' introduces into the world an object that is not there, that is 'the form of leaf' (Nietzsche, 1976). This problem is exacerbated when scientists combine such generalisations into what Nietzsche called "great edifices of ideas". The resulting structure is actually an illusion. It merely repeats on a higher and more complex level the falsification present in each individual generalisation (Grenz, 1996). Accordingly, Nietzsche argues that the laws of nature are not inherent within the real world, but man-made impositions upon the world.

It has been said that Nietzsche's critique of modern science and philosophy articulates many of the important issues for a science and philosophy of the future, but ultimately leaves the core of the work for the next generation (Nietzsche, 1964; 1976). Certainly, the ideas inherent within Nietzsche's writings call into question the entire modern quest for objective, absolute and unchanging knowledge about the world. What we traditionally view as knowledge is actually an interpretation and always involves some sort of creative imposition. It is this understanding of knowledge that is expanded into a thoroughgoing 'contextual epistemology' by the post-Nietzscheans - those who have taken up Nietzsche's challenge and attempted to continue the work on a science and philosophy of the future. It is these post-Nietzscheans that have been labelled (either by others or themselves) as postmodern and it is to them that we now turn.

62 In The Gay Science, Nietzsche (1887a) also argued for the perspectival nature of perception, claiming: "the insect or bird perceives an entirely different world from the one humans do, and the question as to which of these perceptions of the world is more correct is quite meaningless."

1.3 Towards a Postmodern Episteme

"Many cultural obseruers agree that the western world is in the midst of change. ln fact, we are apparently experiencing a cultural shift that rivals the innovations that marked the birlh of modernity out of the decay of the middle ages: we are in the transition from the modern to the postmodern era."

- Stanley Grenz

As has already been suggested, much 'sociologic blood' could be spilt over the question of whether we are experiencing the opening of some fundamental fissure as the Western world begins to reject modernity and embrace a new episteme. Indeed, much blood has already been spilt (see Best and Kellner (1991) or Natoli and Hutcheon (1993)). Accordingly, no such attempt shall be made here. One point, however, must be made in reference to the above. That is, the form of such argumentation seems worryingly circular. For instance, phrases such as 'fundamental fissure' imply some kind of mind-independent, or objective, transition in society from one 'state' to another. At the same time, however, the new episteme is characterised by a rejection of mind-independence, or objective access, in any area of study. Thus, when arguing against objectivity and for corrigibility, many postmodern writers invoke a metanarrative that fails to take account of the subjectivity and corrigibility of the modern / postmodern categorisation itself (and, therefore, commit the performative fallacy). Ironically, the rejection of the grand-narrative typically co-exists with a grand-narrative of its own.

Notwithstanding the above, it is undeniable that a postmodern 'intellectual movement' has arisen over the past quarter of a century. Indeed, many authors now specifically characterise themselves as postmodern. Accordingly, it would be remiss not to attempt some brief description of the breadth and diversity of this movement (Section 1.3.1). Following this, some of the main ideas raised but often obscured by the arguments of the 'fissure school' are explored. These include the continued, post-Nietzschean, critique of modern epistemology (Section 1.3.2) and contributions from some of the most outspoken self-proclaimed postmodernists on both sides of the Atlantic (Sections 1.3.3 and 1.3.4). Ultimately, however, all of this will be used to suggest implications for science and systems (Section 1.3.5).

1.3.1 Postmodernism Everywhere: The Many Faces of Postmodernity

"What's going on just now? What's happening to us? What is this world, this period, this precße moment in which we are living?"

- Michel Foucault

What is postmodernism? According to Charles Jencks's (1989) book of the same title, postmodernism was born in St Louis, Missouri, on July 15, 1972 at 3:32pm.

After sacrificing millions of dollars, the Pruitt-Igoe housing project in St Louis (a landmark of modern architecture) was razed with dynamite. This event has been hailed within architectural circles as the birth of postmodernism. In the years that followed, architects slowly moved away from the application of 'mechanical reasoning' to the shaping of space (which resulted in form following function) to new complex forms, drawing motifs from the past without reference to the original purpose or function (Shire, 1997)63. If Jencks' account of the birth of postmodernism is accepted then it would be easy to form the opinion that postmodernism is simply an architectural movement. However, far from being confined to architecture, self-proclaimed 'postmodernists' are to be found in almost every form of contemporary human activity, including the arts, theatre, film, literature, poetry, journalism, politics, medicine, the information sciences, the human sciences and the natural sciences.

Both Kohler (1977) and Hassan (1993) suggest that the first use of the term 'postmodern' in the open literature was Federico de Oniz's Antologie de la Poesia Espanola e Hispanoamericana (de Oniz, 1934). However, both also suggest that the connection between de Oniz's 'postmodernismo' (which constituted a minor self-critical reaction to modernism) and the contemporary 'postmodern' movement is somewhat tenuous. Perhaps of more interest is the work that started to appear in the 1950s and 1960s in America by writers such as Susan Sontag, Leslie Fiedler, Ihab Hassan, Richard Wasson, William Spanos and others. Sontag (1966) attempted to associate postmodernism with the American counter-culture of the 1960s (and to some extent with the 'beat' poets of the 1950s). According to Sontag, postmodernism was an aesthetic movement, connected to art, architecture, design and, more broadly, to culture and society. Sontag's 'new sensibility' was described as:

63 In the 50 years preceding this event, modernist architects throughout the West had developed what came to be known as the international style. As an expression of the wider modernist ethos, this architectural movement was guided by faith in human rationality and the hope of constructing a human utopia (Grenz, 1996). Imbued with modernist utopianism, architects constructed buildings according to the principle of unity. Frank Lloyd Wright (1970) expressed the motivation of the modern architect well when he declared: "A building should be one great thing instead of a quarrelling collection of many little things". The modern commitment to the principle of unity produced an architecture characterised by what Charles Jencks calls univalence. Modern buildings display simple, essential forms by allowing one theme to dominate the construction (usually achieved by a device called repetition). Perhaps the most common modern repetitive style is that typified by the nearly universal pattern of glass and steel boxes. As it developed, modern architecture became a universalising movement. It promoted the program of industrialisation and demoted the variety characteristic within local expression. On the other hand, postmodern architecture emerged as a rejection of the universalising ethos of the modernist program. Instead of the modern ideal of univalence, postmoderns began to celebrate multivalence. Rejecting the austerity of repetition, postmodern architects purposely explored incompatibilities of style, form and texture. However, lying behind the rejection of modernist architecture is a deeper principle. Postmodernists claim that all architecture is inherently symbolic and that all buildings (including modern structures) speak a kind of language (Jencks, 1984; 1989). Indeed, the postmodernists argue that the architectural accomplishments of modernity were less expressions of reason and logic, and more articulations of a language of power.
Modern buildings derived their language from the industrial systems they served. The forms and materials gave expression to the fulfilment of Bacon's vision of humans ruling over nature via the application of the scientific method. Conversely, postmodern architects want to abandon this language of power and move away from what they see as the dehumanising uniformity of modernity. In its place, postmodernists seek to explore new styles that incorporate the postmodern concepts of diversity and pluralism.

64 "extending its medium and means into the world of science and technology, into the popular, and doing away with old distinctions ... [being] totally committed to eclecticism, ranging far and wide across the cultural and scientific landscapes of the twentieth century and recognising no barriers".

As Jurgen Peper (1977) put it: "art, science and 'technology of behaviour' merge". Indeed, Sontag's postmodernism "refused to take art 'seriously' at all, in the old sense" (Graff, 1979).

The 'aesthetic school' of Sontag (1966), however, was not the only understanding of postmodernism in the sixties; others writing about postmodernism saw it as a form of literature (Fiedler, 1965; 1975) or as a logical successor to the Romantic movement (Graff, 1979), or of having links with the Avant Garde (Hassan, 1971). Of these early characterisations, perhaps Richard Wasson's is most enduring (certainly it is the most valuable for our purposes). Wasson (1969) explicitly identified postmodernism with various intellectual revolts against Enlightenment rationality. To Wasson (1969), postmodernism was not an instinctive aesthetic revolt against traditional artistic boundaries, but a collection of revolutionary ideas. That is, a worldview. Perhaps even an emerging societal-wide worldview. Wasson linked postmodernism with the rejection of Cartesian subject-object dualism and maintained that it was therefore characterised by "radical ontological doubt". As such, Wasson (1969) introduced postmodernism to the scientific, sociological and philosophic communities.

Following Wasson the idea that postmodernism was not solely associated with American counter-culture gained some currency. Spanos (1972; 1976; 1977; 1979) went so far as to claim that postmodernism was indeed European in origin, and not American, associating it with Heidegger's (1962; 1968; 1971) existentialism. Palmer (1977) continued the re-think, arguing that postmodernism was not so much a 'movement' but "something closer to an archaeological shift in the presuppositions of our thinking" (unfortunately, Palmer failed to make the connection with Nietzsche's "philosophy of the future")64. Increasingly, postmodernism came to be seen as an emerging worldview, a banner under which all new intellectual movements that could not be classified as 'modern' found refuge. Kohler (1977) expressed the situation as follows:

"Despite persisting controversies as fo what constitutes the characteristic traits of the new era, the term 'postmodern' is now generally applied to all cultural phenomena which have emerged since the second World War and are indicative of a change in sensibility and attitude, making the present age post-the-Modern"

Hassan (1980; 1983), expanding on his former 'Avant Garde' characterisation, jumped on this and attempted to map out the dominant figures in the emerging worldview (what he termed the 'new epistemology'). Nietzsche began to be seen as a significant formative figure, following which the post-structuralist critique of structuralism began to be seen as a major postmodern work. Foucault's (1975; 1976; 1984a,b) 'genealogy', Derrida's (1966a,b; 1972a,b) 'deconstruction' and Baudrillard's (1983) 'simulation' were all swept up into the fray65.

Following Hassan's 'map' (1980; 1983) the idea that postmodernism is a worldview-in-the-making has dominated the debate. Calinescu (1983) expressed the state of affairs as follows:

"l must begin by admitting that a pluralist renaissance in contemporary thought, as I see it, is as much a phenomena-in-the-making as a desideratum."

Accordingly, attempts at characterisation have been proposed, critiqued, defended and rejected. Indeed, 'postmodernism' is now used by so many people to describe so many aspects of cultural and intellectual life that its meaning has become increasingly fuzzy. Some, such as Rorty, who have used the term in the past, now choose not to, claiming that the term now carries far too much baggage to be a useful appellation. This has raised significant difficulties for postmodern discourse in the 1990s and beyond. These difficulties are augmented by Lyotard's (1979) defining characteristic of postmodernism as "incredulity towards meta-narratives"66. If taken out of context, as it often is, this definition precludes a precise characterisation of postmodernism - by claiming that any such characterisation can only ever be a metanarrative itself and thus worthy of our incredulity. However, a careful reading of Lyotard suggests that it is not just any narrative, but narratives of legitimation that the postmodernist is incredulous about. Thus, notwithstanding the perennial difficulties associated with characterisation, there is at least one aspect of postmodernism that is beyond question. Whatever else it may be, as the name suggests, postmodernism signifies the quest to move beyond the narratives of legitimation characteristic of modernism67. Consequently, an articulation of the postmodern worldview must begin with a characterisation of those grand narratives that it defines itself in opposition to.

64 Although Palmer does quote Heidegger's call for a new episteme that would transcend the pitfalls of Western rationalism. 65 Derrida's 'deconstruction' has been resisted by some who argue that postmodernism should not be solely characterised as a deconstructing worldview but have a positive aspect as well. Calinescu (1983; 1987) prefers to associate postmodernism with pluralism, arguing that pluralism moves beyond deconstruction (what he calls negative-monism) and proposes a way forward. 66 In 1979 the Conseil des Universités of the government of Quebec requested a report on "knowledge in the most highly developed societies". The task was given to the French philosopher Jean-Francois Lyotard from the Universite de Paris. The report which presented the results of his investigations was entitled "The Postmodern Condition: A Report on Knowledge". In it Lyotard argued that the recent changes in cultural and intellectual life were being underpinned by a revolution in our understanding of knowledge. Lyotard (1979) adopted the term 'postmodernism' to describe the new worldview and defined it as encompassing an "incredulity towards meta-narratives". 67 There has, of course, always been opposition to modernity. Most notably the Romantics and the Pietists. However, the postmodernists argue that whereas these movements rose and fell as competitors to modernity from an early stage in its development, postmodernism represents what has arisen after modernity burnt itself out.

1.3.2 Through the Looking Glass: Dissolving Modernist Epistemology

"We need to interpret interpretations more than we need to interpret things."

- Montaigne

As we have seen, the modern worldview pioneered by Francis Bacon, Rene Descartes, Isaac Newton and others, during the Enlightenment, underwent two significant assaults in the century and a half leading up to the twentieth century. Kant's Critique has often been described as the final articulation of modernism. However, it is argued here that the hypotheses that "the mind is active in the knowing process" and that "the Cartesian 'thinking self' creates rather than inhabits the world" irreversibly changed modern philosophy. No less devastating was Nietzsche's critique of the "dominance of the transcendent" and his claim that scientific knowledge does not embody transcendental truth, but is an "edifice that we have created". Following Nietzsche, much work was needed to articulate the consequences of these twin assaults on the epistemology of modern science. Thinkers such as Dilthey, Saussure and the later Wittgenstein were chief amongst these early 'post-Nietzscheans'.

Following Nietzsche's association of 'the world' (facts) with 'the word' (interpretations), a logical path was to explore the domain of hermeneutics (the interpretation of written texts). The father of modern hermeneutics, Friedrich Schleiermacher (1768-1834), actually predates Nietzsche. Schleiermacher's (1799; 1810) work began with the realisation that biblical texts are not systematic theologies. Rather, they are the products of creative minds responding to particular circumstances. Thus, in order to understand a text, an interpreter must set it in the context of the life, culture and worldview of the author. Schleiermacher differentiated between two aspects of interpretation: grammatical and psychological. The grammatical approach looks for meaning inherent within the words and phrases of the text, whilst the psychological approach seeks to go beyond the words and uncover the mind of the author. The psychological approach is commonly referred to as reconstruction, since it seeks to reconstruct the text by tracing the process by which it came to be, including the author's personal outlook, life and larger socio-cultural context. Schleiermacher suggested that, of the two approaches, the psychological approach was more likely to yield a deeper understanding of the text (especially in the case of a text originally penned in an alien language and/or alien culture from that of the reader). Unfortunately, Schleiermacher's optimism concerning an interpreter's ability to reconstruct the mind of the author was quickly undermined by conflicting expert reconstructions. Upon reflection, this failure is not unexpected, given that the approach embodied the discredited Kantian assumption that both author and interpreter are manifestations of a single universal human mindset.

In contrast to Schleiermacher, Wilhelm Dilthey (1833-1911) denied the Kantian 'transcendental apperception'. According to Dilthey (1976), the 'self' is linked to the construction of historically and socially conditioned 'worldviews'. Dilthey introduced the German term weltanschauung (which was later adopted by the systems theorist Peter Checkland) to describe the fusion of experience, social custom, tradition, knowledge, value-judgements and principles that make up an individual's worldview. Following this, Dilthey argued that because our experience of the world is finite, our worldviews are never complete or absolutely true. Our understanding is limited by the historical context in which we live and we are thus unable to interpret the past free from the concepts and concerns of the present68.

The task of understanding texts is further complicated by what Dilthey calls the hermeneutical spiral. The problem arises with the realisation that complex wholes and their parts are always inseparably intertwined. Dilthey argued that we can comprehend a whole only by appeal to its parts, but the parts acquire their meaning only within the whole. Grenz (1996) explains the hermeneutical spiral by way of example:

"We can understand the imperative "Hand me my club" only as we grasp the meaning of the individual words. But we can only select the appropriate meaning of 'club' or realise that 'hand' ìs a verb, not a noLtn, when we realise what the entire sentence means".

According to the hermeneutical spiral, we can understand the thinking of individuals only by comprehending the cultural environment in which they lived, but our comprehension of the culture of a given era requires an understanding of the thinking of individuals who lived at the time. The hermeneutical spiral undercuts all hope of finding a transcendent starting point or a self-evident certainty on which we can construct an edifice of unconditional knowledge of the past69. Or, in the words of Dilthey:

"Here we encounter the general difficulty of all interpretation. The whole of a work must be understood from individual words and their combination, but full understanding of an individual part presupposes understanding of the whole. This circle is repeated in the relation of an individual work to the mentality and development of its author, and it recurs again in the relation of such an individual work to its literary genre ... so all understanding always remains relative and can never be completed"(Dilthey, 1976).

68 This theme is repeated by Hans-Georg Gadamer (1960), in his Truth and Method, who argued that the interpreter can never recreate the mind of the author, nor understand the past as it was to those living it. According to Gadamer, objectivity in regard to hermeneutics is a fallacy and the only respite from absolute relativism is that lying behind the "Babel of competing interpretations" is a shared reality - a world, a language or a tradition. Because of the existence of some common dimension, Gadamer argued that we can anticipate experiencing a "fusion of horizons" such that various interpretations can be compared and contrasted: this fosters a "communion" where we "no longer remain what we were". 69 As we shall see later, these ideas are paralleled within the philosophy of science with significant implications for the nature of scientific method and mathematical modelling.

68 According to Dilthey, the problems associated with this part-whole relationship are not limited to hermeneutics, but pervade all aspects of the human world.

Perhaps the first thinker to explore these ideas outside the domain of knowledge of texts (hermeneutics) and specifically in relation to knowledge of the world (natural science) was Ludwig Wittgenstein (1889-1951). Wittgenstein's first major treatise was entitled "Tractatus Logico-Philosophicus" (Wittgenstein, 1922)70. In Tractatus, Wittgenstein replaced the direct critique of knowledge with a critique of language (thus inaugurating the modern analytical school of philosophy). His argument was that the problem of the scope and limits of knowledge was parasitic on the more fundamental problem of the scope and limits of language. "Knowledge is expressed in sentences, which have to achieve sense before they can achieve truth", he claimed, and therefore: "all philosophy is critique of language". In the Tractatus, Wittgenstein asserted that the purpose of language was to state facts. That is, language 'mirrors' the world. The 'Mirror Theory' (as it became known) defined the boundary between the 'meaningful' and 'meaningless' by claiming that the relation between language and 'the world' is what gives sentences their 'meaningfulness'. Thus, everything that we can meaningfully say (and by way of implication, everything that we can meaningfully think or know) must be a projection of a possible arrangement of objects in the world. According to Tractatus, the boundary of meaningfulness is drawn very tightly around 'empirical' sentences. As such, Tractatus was enthusiastically adopted by the logical positivists of the Vienna Circle who dominated philosophy of science at the time (Becker & Hacker, 1983).

Following the publication of Tractatus, Wittgenstein abandoned philosophy and did not return to the fold for almost a decade. However, upon returning, Wittgenstein shocked the entire philosophic world (especially his positivist supporters) by proceeding to elaborate a completely contradictory set of ideas to those developed in Tractatus. The later Wittgenstein's work (found in his posthumously published Philosophical Investigations) denied that the boundary between the 'meaningful' and 'meaningless' can be drawn by reference to any general set of principles, including those developed in Tractatus (Wittgenstein, 1953). In doing so, Wittgenstein refuted the central assertion of Tractatus, that language has a single purpose. According to the later Wittgenstein, words do not have any 'essential' meaning, nor do they have to 'mirror' the world to be meaningful. Indeed, many descriptive words obtain meaning from their place in an overall sentence structure. This may at first glance look like a small change; however, the repercussions of such an admission called into question the entire set of conclusions of the Tractatus. Language is not an impersonal logical system, able to support an impersonal objective body of knowledge. On the contrary, language

70 Tractatus has been described as a work of "rare originality" (Dancy and Sosa, 1992). Indeed, it could be argued that Wittgenstein's Tractatus is the foundational text of post-positivistic philosophy. It was the first work to suggest that philosophy of language should become 'first philosophy' and, as such, it inaugurated the linguistic turn of analytic philosophy.

requires an anthropomorphic examination far more than a logico-reductive one. This change led Wittgenstein to formulate his well-known (anthropomorphic) concept of language games.

According to Wittgenstein's theory of language games, each use of language occurs within a distinct and self-contained system, complete with its own rules. In this sense, our use of language is similar to playing a game. Two important corollaries of Wittgenstein's theory of language games are:

1. Our various language games colour (or alter) the way in which we experience our world. Thus, the idea of objectivity is a fallacy71.

2. No proposition can be limited to a single meaning (because its meaning is necessarily dependent on the context, or language game, in which it appears). Hence 'truth' is no longer considered a correspondence relation between a sentence and reality, but an internal function of language. We can never claim to be stating the final truth (or truth in any ultimate sense); at most, we can produce utterances that are true within the context in which they are spoken.

These observations are central to a postmodern understanding of science, knowledge and truth. The central question is no longer 'what exists?', or 'how do we know?', but 'how does language (including mathematics) function to construct meaning?' In other words, there has been a shift from the pre-modern focus on 'being', to the modern focus on 'knowing', and finally, to the postmodern focus on 'constructing meaning'. Accordingly, language has passed from being a transparent, presumably indifferent, medium of thought, to being a central (and most probably intractable) philosophical problem72. It is becoming increasingly understood that what we think is conditional on the structure of the language in which we think and that no communicable thought is possible independent of language. As Heidegger (1971) states:

"ln the naming, the things named are called into their thinging. Thinging, they unfold world, in which things abide and so are abiding ones".

71 As part of his critique of modern epistemology, the later Wittgenstein claimed that a philosopher should be someone who "knows nothing and is troubled to the depths of their being by their ignorance". According to Wittgenstein, too few modern epistemologists were disturbed and troubled in this way. Wittgenstein claimed that most modern epistemologists assumed that objective observation was possible and that the current 'scientific way' of seeing the world was the 'truthful way'. However, according to Wittgenstein, when we assert truth, we only ever see a partial and incomplete picture, which comes from a distinct perspective. Knowledge, therefore, is always contingent and never absolute. 72 Both the postmodernists and the more traditional analytic philosophical communities accept this position. The difference between the two is that whereas the postmodernists claim that the problem of the philosophy of language is intractable in the same way that the problem of epistemology was, the analytical philosophers claim that epistemological questions are finally being formulated correctly (within philosophy of language) and therefore language can provide the foundations for a general theory of knowledge.

1.3.3 Attacking the Foundations: Postmodernism and European Post-Structuralism

"There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy."

- William Shakespeare (Hamlet)

In a similar vein to Wittgenstein, the Swiss linguist Ferdinand de Saussure (1857-1913) argued that there are no logical reasons why words mean what they do. The most that can be said is that this is how language functions. Or, in linguistic jargon, the bond between the signifier and the signified is arbitrary and hence signs can only be defined in terms of their relationship within the system of language. Saussure argued for what he called 'synchronic' linguistics over the more prominent historical school. According to the historical school, external factors such as geography and migration should be studied in order to come to terms with human linguistic behaviour. Saussure, on the other hand, argued for an ahistorical approach that viewed language as a complete and internally coherent system. Accordingly, Saussure scorned the traditional view that language could be an unmediated representation of something non-linguistic (e.g. a sense-datum), stating:

"If words stood for pre-existing concepts, they would all have exact equivalents in meaning from one language to the next; but this is not true ... Instead of pre-existing ideas we find values emanating from the system. When they are said to correspond to concepts, it is understood that the concepts are purely differential and defined not by their positive content but negatively by their relations with other terms of the system. Their most precise characteristic is in being what the others are not ... Everything that has been said up to this point boils down to this: in language there are only differences. Even more important: a difference generally implies positive terms between which the difference is set up; but in language there are only differences without positive terms. Whether we take the signified or the signifiers, language has neither ideas nor sounds that existed before the linguistic system, but only conceptual and phonic differences that have issued from the system." (de Saussure, 1916).

According to Saussure, the task of the linguist is to observe the linguistic conventions and relations operative at any given time. For it is only these internal factors that endow the signs of the system with the values they have (Grenz, 1996). Thus he proposed the formation of a new science, semiology (the study of signs).

Saussure's 'structural' school of linguistics understood language as a socially-based, coherent, system. The implications of Saussure's structuralism were that 'truth' could no longer be seen as being constituted in a correspondence between language (word) and reality (thing). Accordingly, his followers developed alternative conceptions of truth as an internal

property of the system of language73. Here, for the first time, was a discipline (linguistics) reforming its modern presuppositions from within. Accordingly, related disciplines began to follow suit (Best and Kellner, 1991). Following Saussure, intellectuals in a variety of the human sciences (and some in the natural and analytical sciences) began applying similar 'structural' understandings to the phenomena of their own disciplines, thus cementing 'structuralism' as a multi-disciplinary intellectual movement (Sturrock, 1979; 1986). The pioneers of this movement believed that structuralism was an idea whose time had come and that by adopting structuralist modes of thought they were falling into line with the intellectual 'spirit of the age'. This 'spirit' was characterised as a widespread reaction against the modes of thought that had preceded structuralism (variously described as atomistic, reductionist, empirical or behaviourist). The structuralists argued that there had been far too much focus on gathering what were thought to be 'self-contained facts' and far too little attention to the relationships between these 'facts'. Accordingly, a new wave of 'structuralist' theories and methods arose in several disciplines.

In philosophy, the German philosophers Husserl (1907; 1913; 1930a,b; 1936) and Heidegger (1962) began to develop 'phenomenology', which sought to describe the (structural) 'essence' of consciousness rather than consciousness as it functions in each individual. In psychology, Gestalt theorists argued that all conscious experience was patterned (Koffka, 1962; Kohler, 1947; 1969; 1976; Lewin, 1966). Gestalt psychology specifically dealt with 'wholes' and ran counter to the beliefs of the behaviourism of the time, which studied isolated episodes of behaviour in terms of stimulus/response. Kohler (1969) expressed the distinction as follows:

"Our view will be that, instead of reacting to local stimuli by local and mutually independent events, the organism responds to the pattern of stimuli to which it is exposed; and that this answer is a ... functional whole".

In anthropology, Levi-Strauss (1963; 1966) began applying structural techniques to the study of mythology, kinship and other anthropological phenomena. Lacan (1977) developed a structural psychoanalysis, Althusser (2001) developed a structural Marxism and the Society for General Systems Research (established by Von Bertalanffy, Boulding, Rapoport and others) attempted to apply structuralist modes of thought in the natural and analytical sciences.

The catch-cry of the structuralist revolution was 'holism'. Indeed, the structuralists analysed everything in terms of parts and wholes, defining a 'structure' as the inter-relationship of parts within a system74. In Barthes' (1968) words:

73 The structuralist position also undermines a referential (correspondence) theory of 'knowledge' and 'meaning'. 74 Structuralism, which is similar to Pepper's 'organicism', is one of the principal metaphysical presuppositions behind the rise of systems theory. Within structuralist theory there exist analogues of every major concept of the early

"The aim of all structuralist activity ... is to reconstitute an object, and by this process, to make known the 'rules of functioning', or 'functions' of this object, which ... brings out something that remained invisible, or, if you like, unintelligible in the natural object."

Despite the initial enthusiasm associated with structuralist activity within the various disciplines that had adopted the new 'holism', a new generation of thinkers, calling themselves 'post-structuralists', began to argue that the structuralist revolution did not go far enough. Specifically, the post-structuralists argued that the focus on the ahistorical structures of the objects of structuralist inquiry involved a naïve ontological position. According to the post-structuralists, structuralism was right to move away from referential truth, but wrong to insist that the structures of their disciplines were ahistorical. Consequently, the post-structuralists attacked what they saw as the modern 'scientific' pretensions of structuralism (Behler, 1988; Behler & Taueneck, 1991; Belsey, 2002; Boyne, 1990; Caputo, 1997; Culler, 1982; Dreyfus & Rabinow, 1983; Hawkes, 2003; Idhe, 1998; Silverman & Idhe, 1982). These pretensions led the structuralists to believe in universality and objectivity. Instead, the post-structuralists favoured a thoroughly contingent, historical (as opposed to ahistorical) view in regard to the development of disciplinary knowledge. Indeed, the post-structuralists demanded that intellectuals in all fields surrender their pretensions to objectivity and accept the subjective, historically contingent and linguistically constructed aspects of disciplinary knowledge. Accordingly, a series of post-structuralist critiques were articulated in what amounted to a devastating attack on structuralist ideology and created an atmosphere of intense theoretical upheaval. These were the heady days of the French intellectual scene of the 1970s and 1980s. The central figures included (amongst others) Foucault, Derrida, Kristeva, Lyotard, Baudrillard and Barthes. Of these, perhaps the most prominent thinkers were Foucault and Derrida. Accordingly, it is to these that we now turn.

1.3.3.1 Michel Foucault and Power Relations

"Power produces knowledge ... Power and knowledge directly imply one another ... There is no power relation without the correlative constitution of a field of knowledge, nor any knowledge that does not presuppose and constitute, at the same time, power relations."

- Michel Foucault

system thinkers, including 'system', 'environment', 'components', 'relationships' and 'emergence'. A study of the influence of structuralism on systems thinking would be both feasible and desirable, as would a post-structuralist critique of systems thinking. Whilst this work looks at some of these issues, its scope is set deliberately broader, bringing together an eclectic group of ideas from neo-empiricists, neo-Kantians, phenomenologists, post-Nietzscheans, post-structuralists, pragmatists, radical epistemologists and others. This melange of influences provides several distinct critiques of certain fundamental modern ideologies associated with science and systems thinking. Accordingly, it is collectively labelled as postmodern. Whilst post-structuralism is directly related to postmodernism, most commentators see postmodernism as exhibiting a broader range of theoretical, cultural and social discourses, of which post-structuralist critiques are a subset.

Michel Foucault (1926-1984) is perhaps the best-known figure of contemporary French thought. After completing his licence in philosophy (1948) and psychology (1949) and his agregation in philosophy (1951), he worked in a psychiatric hospital for three years before returning to academia. After several years in academic departments in Sweden, Poland and Germany, Foucault returned to France in 1960 to complete his doctorat d'etat in the history of science under Georges Canguilhem. Upon completion of his doctorate, Foucault served as head of philosophy departments at the University of Clermont-Ferrand and the University of Vincennes before being elected to a Professorship in 1970 at the highest academic institution in France, the prestigious College de France. At the College, Foucault chose the title of Professor of the History of Systems of Thought for his chair and remained in this position until his untimely death in 1984. The breadth of his work (as evidenced by the title of his chair) has made it difficult to categorise him. As such, he has been labelled a philosopher, cultural historian, literary analyst, political critic, sociologist and anthropologist by various commentators. His preferred designation, however, was to be known as an archaeologist of knowledge (Grenz, 1996). Whatever the label, Foucault was the quintessential post-Nietzschean.

As we have seen, before Kant, it was thought that knowledge was grounded in the a priori 'structure of reality'. However, following Kant's Critique of subject-object dualism, the neo-Kantians (and the phenomenologists) grounded knowledge in the a priori 'structure of thought'. Nietzsche, on the other hand, despaired of the possibility of grounding knowledge in anything transcendental. Neither the mind nor the world provided the sort of unquestionable basis for knowing that the foundationalists desired. Furthermore, if we continued to hypothesise the existence of some transcendental relation capable of providing certainty, then the unavoidable consequence of such a 'valuation' would be nihilism. Accordingly, Nietzsche elevated the specific and celebrated diversity, arguing that a science and philosophy of the future needed to reject the 'myth' of the 'certain' and the 'general' in favour of the 'truth' of the 'uncertain' and the 'contingent'.

Foucault's work embraced Nietzsche's thought in a way that none had done thus far. Deliberately spurning the idea of uncovering general (or essential) categories of thought, Foucault attempted to uncover the contingent, historical structures that had been at play 'behind the scenes' in the development of the various human sciences. In so doing, Foucault hoped to show that the self-understanding of these human sciences (as containing some objective body of knowledge) was flawed and that the state-of-the-art was much more influenced by the underlying ideologies that had gained ascendency in these disciplines than their self-understanding would have it. As such, Foucault took the unusual step of studying the historical development of a priori disciplinary concepts rather than the historical development

of disciplinary theories. His areas of application included: psychology and psychiatry (Madness and Civilisation, 1961); medicine (The Birth of the Clinic, 1963); penology (Discipline and Punish, 1975) and sexology (The History of Sexuality, 1976; 1984a,b).

In his first book, Madness and Civilisation, Foucault (1961) traced the historical origins of both psychology and psychiatry, claiming that the modern conception that these disciplines have discovered the 'true nature' of madness as 'mental illness' cannot sustain a serious critique. Accordingly, Foucault claimed that the category 'mental illness' is a construction of a psychiatry and psychology in the service of modern society's attempt to control (by exclusion) those who do not conform to certain basic values of behaviour. There are, Foucault argues, in all societies such 'deviations from the norm' that may warrant the appellation 'mad'. However, characterising the 'mad' as 'mentally ill' (i.e. suffering some analogue of physical disease) is a peculiar invention of modern western societies. Other societies, such as Renaissance Europe for example, accepted the importance of such deviations (Gutting, 1989)75. However, the modern world (on the alleged authority of psychiatry and psychology) claims that madness has no status beyond that of 'objective mental deficiency'. Foucault questioned the very authority that is at the foundation of this claim, arguing that the development of the disciplines of psychology and psychiatry does not indicate the "gradual discovery of the true nature of madness, but simply the sedimentation of what the history of the West has made of it for the last three hundred years".

Following Madness and Civilisation, Foucault (1963) published The Birth of the Clinic, in which he extended his critique of mental illness to physical illness. Like psychiatry, Foucault argued that modern medicine sees itself as based on an objective body of scientific knowledge (in this case allopathic anatomy). In The Birth of the Clinic, Foucault claimed to demonstrate that, rather than being free from interpretation, modern medicine was based on a highly specific way of 'perceiving' bodies and diseases. Following a similar style of argument to that which he initiated in Madness and Civilisation, he traced the history of the latent structures of medical knowledge through three periods: the Renaissance, the Classical Age (the Enlightenment or the Age of Reason) and the Modern Age (which Foucault defines as beginning with Kant at the end of the Enlightenment)76. Of all of Foucault's books, The Birth of the Clinic is the least contentious. However, in it Foucault calls into question much of the self-understanding of modern medicine77.

75 According to Foucault, Renaissance intellectuals and painters saw madness as something that communicated with the great tragic powers of the world. However, apart from one or two isolated flashes (such as Nietzsche's last messages and Van Gogh's final paintings) the modern view of madness has been that it embodied the absence of reason (Gutting, 1989). 76 Whereas most cultural and intellectual historians define the modern era as beginning at the end of the Renaissance (i.e. at the start of the Enlightenment), Foucault differentiates between two post-Renaissance eras: the Classical Age and the Modern Age. As has been mentioned before, this work does not concern itself with arguments about intellectual eras (being primarily concerned with arguments about ideas and their consequences). Accordingly, the traditional (non-Foucault) characterisation of modernity as beginning at the end of the Renaissance is adopted. 77 Perhaps the widespread acceptance of The Birth of the Clinic can be best explained by the fact that in it Foucault avoids many of the sweeping generalisations made in subsequent books.

With his critique of the self-understanding of both modern medicine and modern psychiatry completed, Foucault (1966) embarked upon a comprehensive critique of modern knowledge of human beings in general. This was published in The Order of Things: An Archaeology of the Human Sciences. His central claim in The Order of Things was that all modern knowledge in the human sciences is based upon the a priori concept that human beings are a "knowing subject through which there exists a world of objects". This particular conception of human beings Foucault neatly labelled 'man', and claimed that although modern thinkers tend to take the concept of 'man' as definitive, it is but one of many historically constructed ways of viewing humanity (and by way of implication the human sciences). Indeed, Foucault explicitly rejected the concept of 'man' as the ultimate source of knowledge, claiming that rather than language (and by way of implication, knowledge) being constituted in and through 'man', 'man' is constituted in and through language. Thus, Foucault states:

"To all those who still wish to think about man, about his reign or his liberation, to all those who still ask themselves questions about what man is in his essence, to all those who wish to take him as their starting point in their attempts to reach truth ... to all these warped and twisted forms of reflection we can answer only with a philosophical laugh."

Here Foucault is back to his polemical best, claiming that the western 'episteme' has undergone two distinct 'breaks': between the Renaissance and the Classical Age; and then between the Classical Age and the Modern Age78. As such, the meaning of the word 'knowledge' has changed with each change of episteme. Furthermore, Foucault claimed that the Western world was entering into a period where a similar epistemic break was occurring: that between the modern and the post-modern.

In the course of his critique of the human sciences, Foucault came to describe his method of uncovering the historically grounded, contingent concepts behind the scenes of each discipline as 'archaeology'. Accordingly, his next book, The Archaeology of Knowledge, provides an extended account of this 'archaeological method'. According to Foucault (1969), "instead of exploring the consciousness / knowledge (connaissance) / science axis (which cannot escape subjectivity), archaeology explores the discursive practice / knowledge (savoir) / science axis". By shifting the concern from 'connaissance' to 'savoir', archaeology operates on a completely different epistemic level from that of traditional history and philosophy of science. Whereas 'connaissance' deals with bodies of scientific knowledge, 'savoir' deals with the conditions of possibility for such knowledge. Thus, from an archaeological point of view, science is just one localised formation on the "epistemological site" that is a discursive formation. Furthermore, the norms of scientific practice found at various 'sites' are not

78 According to Foucault, a given age's conception of knowledge is ultimately grounded in its experience of order, its construal of the nature of signs and its conception of language. Such a set of conceptions, Foucault calls the 'episteme' of a period (Gutting, 1989).

unquestionable givens, but the outcome of contingent historico-epistemic processes. Or, in Foucault's words:

"Science (or what is offered as such) is localised in a field of knowledge and plays a role in it. A role that varies according to different discursive formations, and is modified with their mutations".

At the completion of The Archaeology of Knowledge, Foucault remained relatively quiet for about six years until the publication of Discipline and Punish (1975) and The History of Sexuality (volume 1, 1976; volume 2, 1984; volume 3, 1984). In these latter works, Foucault was much more focussed on the role of social and institutional power in the development of knowledge, arguing that bodies of knowledge are not the result of the autonomous pursuit of rationality, but the result of a priori concepts embedded in historically contingent power relations. Whereas Hobbes (1588-1679) asserted "Auctoritas non veritas, facit legem!" and so implicated power in the formation of the law, Foucault argued that power is equally implicated in the formation of disciplinary knowledge.

The later Foucault rejected the Enlightenment view that power is a repressive social force that can only be overcome by the emancipatory light of truth (as revealed through the application of reason). According to Foucault, 'power' is not always a single repressive force flowing down some social or political hierarchy, and reason is never able to remain autonomous from power relations79. Instead, Foucault saw society as "shot through with a multiplicity of power relations". Thus, bodies of knowledge are always influenced and evolved by systems of power80. In associating knowledge with power, Foucault rejected two of the most dominant ideals of modern intellectual life: the idea that knowledge was neutral and objective (a.k.a. Positivism) and the idea that knowledge was fundamentally good and emancipatory (a.k.a. Marxism). Instead, Foucault argued for a new form of study, one that traced the historical development of bodies of knowledge from systems of power. Terming such a study a 'genealogy', Foucault went on to employ the twin techniques of genealogy and archaeology to the study of such diverse areas as biology, economics, penology, sexology and ethics81.

Foucault's work can be said to crystallise around the realisation that Western society has, since the birth of the Enlightenment, made a number of fundamental errors, believing erroneously that:

79 Foucault rejects what he calls 'modernist' theories of power that see power anchored in macrostructures or ruling classes. Instead, Foucault develops what he calls a 'postmodern' perspective of power, which sees it as dispersed, indeterminate and heteromorphous. 80 Thus, Foucault adds a moral twist to his position by claiming that every interpretation of reality (or every meta-narrative) is an assertion of power. Because knowledge is embedded in the world, it is always involved in power struggles. Foucault cites psychiatry as an example, which declares that schizophrenics exist, and then views them as the objects of therapy (Foucault, 1961; Foucault, 1963). 81 Whereas archaeology uncovers the contingent metaphysical concepts behind the scenes in the development of all disciplinary knowledge, genealogy uncovers the contingent power relations behind the scenes in the development of all disciplinary knowledge.

1. An absolute body of knowledge exists and is waiting to be discovered.

2. The knowledge we do possess is objective, neutral or value-free.

3. The pursuit of knowledge benefits all humankind.

Obviously, Foucault's position undermines any conception of objective science. In fact, he characterised science as 'ideology' and asserted that, as such, it is entwined with power relations82. The principal targets of Foucault's later genealogies were the grand narratives of science, sociology and history. According to Foucault, the goal of modern thinkers had been to devise grand narratives for these fields under the theme of progress or emancipation. As such, he claimed that these interpretations only sought to legitimise present structures and mask the "will to power" operative through them.

1.3.3.2 Jacques Derrida and Deconstruction

"The absence of a transcendental signified extends the domain and the play of significations infinitely."

- Jacques Derrida

Jacques Derrida was born in 1930 in Algeria. Upon completion of his Literature studies at the University of Algiers in 1948, Derrida moved to France to read philosophy under Jean Hyppolite at the Ecole Normale Superieure (ENS), attaining his Memoire (Masters) with a thesis on Meaning, Structure and Genesis in Husserl. In 1957, after completing a one-year visiting scholarship at Harvard, Derrida returned to France to begin his Doctorate on The Ideality of the Literary Object. However, work on this project was abandoned as Derrida became increasingly disillusioned with the inescapable textuality of philosophical writings83. From 1960 to 1964 Derrida taught literature and philosophy at the Sorbonne and began working on the type of problems at the interface of phenomenology and structuralism that would come to be associated with his entire school of thought. During this time he published his first book, a translation (with a prize-winning introduction) of Husserl's The Origins of Geometry. Following his stint at the Sorbonne, Derrida returned to the ENS where he taught for the next 20 years. At the ENS, Derrida became involved with the journal Tel Quel, which sought to promote the emergence of a new criticism specifically opposed to positivist literary theory and open to the new 'sciences' of semiology and structuralism. 82 In The Archaeology of Knowledge, Foucault makes four claims in relation to science and ideology. First, that ideology is not exclusive of scientificity and that all other discourses have ideological underpinnings. Second, that theoretical contradictions indicate the ideological functioning of a science. Third, that by clarifying or correcting theoretical contradictions science does not necessarily undo its ideological nature (the role of ideology does not diminish as rigour increases).
Finally, tackling the ideological functioning of a science is not an exercise in re-legitimation but an acceptance that scientific practice is but one discursive formation among others.

⁸³ Indeed, it was to be more than two decades later (1980) that Derrida finally conducted his thesis defence, based on published writings that included Speech and Phenomena, Writing and Difference and Of Grammatology.

During the 1950s and early 1960s Derrida's contributions were largely limited to a small clique within the French intellectual community. However, in 1966 Derrida arrived on the international scene as a significant force with his paper entitled Structure, Sign and Play in the Discourse of the Human Sciences, presented at Johns Hopkins University in the USA. It was at Johns Hopkins that Derrida first began to develop many of the ideas that would be clarified and presented in his books to follow. Indeed, many of his best-known positions can be found in this remarkable paper, including the beginnings of his attack on 'logocentrism', early forms of 'deconstruction' and 'differance', a denial of the 'transcendental signified' and a proclamation of the 'free-play' of 'significations'. In the year that followed his Johns Hopkins address, Derrida would publish three major books: Speech and Phenomena (Derrida, 1966b), Of Grammatology (Derrida, 1966c) and Writing and Difference (Derrida, 1966d), following which he divided his time between teaching at the ENS in Paris and various US universities, including Johns Hopkins and Yale. During these years, Derrida continued to publish widely and further clarify his positions, resulting in another triumvirate of major works in 1972: Positions (Derrida, 1972a), Dissemination (Derrida, 1972b) and Margins of Philosophy (Derrida, 1972c), followed by the obscure Glas in 1974.

During the mid to late 1970s Derrida was deeply involved with GREPH (Groupe de Recherches sur l'Enseignement Philosophique), a group commissioned to examine the institutional aspects of teaching philosophy. Derrida and his colleagues at GREPH argued that a particular "politics of learning" was implicit in almost all of the classic texts on philosophy and science studies. The GREPH group took it as their mission to deconstruct these texts, and by so doing, call into question many of the basic ideas and beliefs that legitimised institutional forms of knowledge. At the same time, Derrida's influence began to be felt all over the globe, as his major works began to be translated into English (Speech and Phenomena, 1973; Of Grammatology, 1976; Writing and Difference, 1978; Dissemination, 1981; Margins of Philosophy, 1982; Positions, 1982; Glas, 1986). Following this success, Derrida was invited to coordinate the International College of Philosophy, a Paris-based venture set up to encourage work in areas of interdisciplinary study and research that found no place within the more conventional disciplinary demarcations. He currently holds positions on both sides of the Atlantic as Director of Studies at the Ecole des Hautes Etudes en Sciences Sociales in France and Professor of Humanities at the University of California, USA.

Much of Derrida's work could be construed as continuing a line of thought that begins with Hegel and runs through Nietzsche and Heidegger to the present. Such an archaeology could be characterised by an increasingly radical repudiation of what Nietzsche called 'Platonism' and what Derrida calls 'logocentrism' (Rorty, 1991b)⁸⁴. Derrida defines logocentricity as a way of thinking that attempts to go back to origins, find centres, fix points of reference, certify

⁸⁴ Indeed, in his early works, Derrida claimed that his prime goal was to "divest us of logocentricity".

truths, verify an author's intent or locate a text's core meaning. According to Derrida, it assumes a transparency of knowledge and language that is just not there. His most influential books have re-read major figures in western philosophy (from Plato to Nietzsche) to expose their logocentricity, claiming that logocentrism is "utterly pervasive" in Occidental culture. Derrida sees, for example, the influence of traditional dualities such as legitimate/illegitimate, rational/irrational, true/false and science/non-science as all arising from logocentricity.

At the heart of Derrida's critique of logocentricity is a critique of foundationalism in regard to knowledge and in regard to language⁸⁵. His works are provocative (if not downright subversive) in their attempt to overthrow what he sees as the Enlightenment ideal of a science and philosophy rooted in transcendental questions. In the absence of logocentrism, Derrida argues that the traditional dualities dissolve. Accordingly, it is the task of the thinker to twist free from these dualisms (and therefore the forms of intellectual and cultural life that they structure). Indeed, Derrida's claims have been so provocative toward Enlightenment rationality that he has found himself embroiled in several controversies⁸⁶.

As a result of his attack on logocentrism, Derrida emphasises the fragility of the link between the signifier and the signified, rendering 'meaning' a more elusive concept than the structuralists had supposed. Instead, Derrida argues that figurative devices are always operative in writing, be it literary, scientific, mathematical or philosophical. Thus, pure thought is never independent from its mode of expression. The use of language can uncover either an underlying belief system that remains unconscious to the writer's intentions (a bit like a Freudian slip) or a hitherto unrecognised rupture in a text's logic. Rather than attempting to find a true meaning or a unified message, Derrida's readings seek to uncover these ruptures in logic (or underlying belief systems), which may unknit a text's perceived unity (or perceived authority). Such a reading Derrida calls a deconstructive reading. According to Derrida, deconstruction is the seeking out of the tensions between rhetoric and logic, between what a text "means to say" and what it nevertheless is "constrained to mean".

⁸⁵ As has already been discussed, an epistemic theory can be said to be foundationalist if it attempts to fix points of reference that may be used as a foundation upon which truths may be certified. The classical foundationalist position in regard to knowledge is that advanced by Descartes, who argued that certain knowledge can be assured from the indubitable foundations of the 'thinking self' and the application of 'scientific method'. A semiotic (or linguistic) theory can be said to be essentialist if it assumes that a system of signs (language) is able to signify (represent) reality in its essential nature. Thus, words (signifiers) can unproblematically communicate meanings present in individual minds (signifieds) such that the listener/reader receives them in the same way as the speaker/writer intended. The classical essentialist treatises are those put forward by the positivists, the early Wittgenstein and Bertrand Russell.

⁸⁶ Perhaps the most notable of these took place at the ever proper Cambridge University (the same institution where another subversive, the later Wittgenstein, found himself at the centre of a vicious two-pronged attack from Bertrand Russell and Karl Popper over half a century earlier). The controversy began with the awarding to Derrida of an honorary degree. This seemingly innocuous action sparked a heated debate amongst the Cambridge dons, the result of which was the unusual step of putting the issue to a vote. Happily, Derrida won this 'popularity contest' by a majority vote of 336 to 204. The moral of this story is that Derrida's works have been highly divisive, and as a result, highly scrutinised by intellectuals in a variety of disciplines. Derrida has been the subject of more than 450 books. In the areas of literature and philosophy alone he has been cited more than 14000 times in journal articles.
More than 500 English-language theses have been written primarily on his work (this is on top of the innumerable French and German theses featuring him as their primary subject). Truly, Derrida is a giant of the late twentieth century and early twenty-first century intellectual scene.

Derrida's development of deconstruction as a literary tool is perhaps best seen in contrast to Schleiermacher's reconstruction. Whereas Schleiermacher seeks to reconstruct the text by tracing the process by which it came to be, including the author's personal outlook, life experiences and larger socio-cultural context, Derrida claims that objective reconstruction is a myth and that 'meaning' emerges in a far more subjective manner. According to Derrida, the 'text as unity' tradition is self-delusive because it asks of language something that language cannot provide: to be an unmediated expression of something non-linguistic (e.g. a signified (concept) and, ultimately, a sense datum (thing)). Various attempts to transcend the metaphysical basis of language were performed by the positivists of the early twentieth century until the later Wittgenstein consigned them all to the dustbin of history with his theory of language games. According to Wittgenstein (and also Saussure) language is nothing but differences. 'Red' means what it does only in contrast to 'blue', 'yellow' etc (Wittgenstein & Anscombe, 1977). Derrida agrees, arguing that any attempt to rid ourselves of metaphysics will be expressed in a language that is entirely constructed in and through metaphysics, and therefore language itself can be relied on to betray any attempt to transcend it⁸⁷. Deconstruction uncovers the "warring forces of signification at play".

It is safe to say that never before has a literary theory attracted the sort of dread and hysteria that deconstruction has incited since its inception. Much of the criticism rests on the observation that deconstruction does not seem to offer any positive alternative to nihilism. Critics argue that because Derrida claims that nothing means any one thing in particular, he concludes that nothing means anything at all. However, this seems to be an unfair position to ascribe to Derrida. Certainly a growing body of opinion within the more orthodox analytic philosophical tradition holds that the existence of causal relations between language and non-language does not suffice to provide a correspondence between language and reality (Rorty, 1979, 1991a,b, 1998; Davidson, 2001). Thus, one could argue that Derrida's claim that "the absence of a transcendental signified extends the domain (and the play) of significations infinitely" is no more scandalous than the arguments of the later Wittgenstein, Dummett, Davidson and others⁸⁸. Both essentially preserve what was insightful in idealism without suggesting that the material world is a creation of the human mind.

On top of all of this, Derrida applies his idea of deconstruction far and wide, even, it seems, to the world as a whole. Just as a text will be read differently by each reader, so reality will be 'read' differently⁸⁹. According to Derrida, deconstruction works to problematise traditional (modern) thought patterns by showing the impossibility of drawing a firm line between reality and representation. Thus, deconstruction can be seen as subverting the modern ideal of truth

⁸⁷ White and Taket (1997) express this situation thus: "if we are, say, trapped within language, and we want to express our trappedness, we are unable to do so other than with the very concepts which trap us".

⁸⁸ Derrida uses the phrase 'transcendental signified' to represent an entity capable of halting the potentially infinite regress of interpretations of signifiers by other signifiers.

⁸⁹ Although not explicitly mentioned, Derrida's ideas represent a profound critique of the Cartesian assumption of subject/object dualism within science.

as correspondence to reality. In fact, Derrida deliberately seeks to undermine all traditional claims to knowledge, arguing that the idea of a single privileged representation is oppressive because it invariably leads to the suppression of all sorts of loose ends that do not fit neatly into the dominant paradigm⁹⁰. Thus, by discrediting the idea of a totalising discourse, deconstruction serves to further the postmodern concern for diversity and pluralism.

1.3.4 For the Edification of Us All: Postmodernism and American Pragmatism

"There is nothing good or bad, but thinking makes it so."

- William Shakespeare (Hamlet)

If post-structuralism is considered primarily a European phenomenon, then its counterpart in America would have to be pragmatism. Not only were the founders of the movement American, but most of its subsequent followers were as well. Whereas Nietzsche, Heidegger, Derrida and Foucault spearheaded the rise of European post-structuralism, the rise of pragmatism on the other side of the Atlantic is associated with such thinkers as Charles Peirce (1839-1914), William James (1842-1910) and John Dewey (1859-1952).

At its heart, pragmatism is a belief about the nature of all beliefs. According to Menand (2001), the rise of pragmatism in the late 19th century is intimately linked to the US civil war. America had just witnessed the most spectacular carnage of its short history. The civil war was a particularly dangerous war for all involved. It was fought with modern weapons and pre-modern tactics. Moreover, the war itself was fought in the name of ideology - a sort of crusade for both sides. The lesson that the pragmatists took from the war is perhaps best expressed by Oliver Wendell Holmes, who claimed that "certitude leads to violence". According to Holmes, "when you know that you know [i.e. are certain], persecution comes easy. It is as well that some of us don't know that we know anything". The generation of Bostonians who fought in the war in their youth, and began to influence intellectual life at Harvard upon their return, picked up on this notion and gradually transformed it into a movement.

The early pragmatists claimed that the desire for certainty had led European epistemology (and philosophy of science) into a series of 'dead-ends'. Seeking to find a way around these, the pragmatists attempted to find 'resolution' in 'dissolution'. That is, they attempted to circumvent the pitfalls of epistemology by seeing things in a different light, a light in which

e0 Students of Kierkegaard will recall that this line of argument is reminiscent of Kierkegaard's critique of Hegel.

traditional problems (such as the quest for certainty) cease to be problems at all and new, hitherto marginal, issues become central (Mouffe, 1996)⁹¹.

1.3.4.1 The Rise of American Pragmatism: Charles Peirce, William James and John Dewey

"A great many people think they are thinking when they are merely re-arrangÌng their prejudices."

- William James

The first great pragmatic writer was Charles Peirce (1932), whose pragmatic maxim stated:

"If one can define accurately all the conceivable experimental phenomena which the affirmation or denial of a concept could imply, one will have therein a complete definition of the concept, and there is absolutely nothing more in it."

'Meaning' is therefore a matter of the conceivable experimental consequences of a concept's application. This sounds very much like positivism, and in many ways it is, as Peirce claimed that the role of pragmatism was to highlight that "almost every proposition of ontological metaphysics is gibberish"⁹². According to Peirce, all "meaningful thoughts were signs" (representations of objects)⁹³. In The Fixation of Belief, Peirce (1877) argued that the aim of all inquiry was to eliminate doubt (i.e. 'fix' belief). Among the possible methods for fixing beliefs, Peirce claimed that "the scientific method was the most successful". As unremarkable as this statement may at first sound, within it lie the seeds of a remarkably different train of thought from that of European science on its road to the positivism that characterised it in the first half of the twentieth century. By choosing to use pragmatic terms such as 'successful' and 'appropriate', Peirce distanced himself from the pre-Nietzschean European obsession with finding a methodological path to certainty. Indeed, Peirce even pre-supposed Popper's famed critique of positivism, claiming that scientific inquiry is characterised by "contrite fallibilism" and that the scientist is distinguished by "his readiness to dump the whole cartload of his

⁹¹ Actually, the method of 'dissolution' is not the sole property of the pragmatists and can be seen in the writings of such predecessors as Kant, Hegel, Kierkegaard and Nietzsche as well as contemporaries such as Wittgenstein. Indeed, Wittgenstein (1953) even gave a label to the technique: perspicuity, claiming that perspicuity was needed to see "that the most thorny problems with which philosophers were confronted were, in fact, pseudo-problems" and that once these problems could be seen correctly the problems were "not seen to be problems at all". It was not that they were 'solved', but that they were 'dissolved'.
⁹² Positivism is an empiricist (or anti-metaphysical) philosophy of science that rose to prominence in the early to mid twentieth century.

beliefs the moment experience is against them". Thus, whilst Peirce begins mapping out his position in positivist/empiricist terms, he finds the traditional scientific quest for 'truth' and 'certainty' unsolvable, and the human condition 'fallible'. Accordingly, he claims that the combination of a strict empiricism with the scientific quest for transcendental truth is unworkable. Pragmatism, according to Peirce, chooses to retain empiricism, thus necessitating the re-formulation of such traditional Occidental concepts as 'truth'.

According to Peirce, the scientific method is characterised by three different types of reasoning: abduction (the postulating of hypotheses to explain observed phenomena); deduction (the discovery of the logical consequences of these hypotheses); and induction (the verification of these hypotheses through repeated testing). Against Humean scepticism of induction, Peirce claimed that all induction resembles statistical sampling and that such reasoning has a self-correcting character. Although there is no logical reason for relying on induction in the short term, repeated use will lead the community of inquirers towards "fixation of belief" (Peirce, 1877).
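Peirce's claim that induction resembles statistical sampling and is self-correcting can be illustrated with a toy simulation. This is an interpretive sketch only, not anything Peirce himself formalised; the function name `fix_belief` and the coin-flip scenario are invented here for illustration.

```python
import random


def fix_belief(true_p, sample_sizes, seed=42):
    """Estimate an unknown proportion from successively larger samples.

    Illustrates the analogy: early inductions from small samples may err,
    but repeated inquiry tends to drive the community's estimate toward
    the value that 'endless investigation' would settle on.
    """
    rng = random.Random(seed)
    estimates = []
    for n in sample_sizes:
        # Draw n observations of an event that occurs with probability true_p.
        draws = [rng.random() < true_p for _ in range(n)]
        estimates.append(sum(draws) / n)
    return estimates


# Successive rounds of inquiry with growing evidence.
estimates = fix_belief(0.3, [10, 100, 10_000])
# The estimate from the largest sample tends to lie close to the true
# value, while small-sample estimates can wander: induction has no
# short-term guarantee, only long-run self-correction.
```

On this reading, "fixation of belief" is the convergence of such estimates under continued sampling, not a one-shot logical proof of the conclusion.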

Peirce's understanding of induction introduced a subtle variation on the European understanding of scientific truth prior to Nietzsche. Whereas the European position had always been transcendental, in that truth embodied 'correspondence to mind-independent reality', Peirce argued that truth is "that concordance of a statement with the ideal limit towards which endless investigation would tend to bring scientific beliefs". This subtle variation leads Peirce to propose an entirely new theory of truth (see Section 2.3.3.2) and to view science not as the transcendental method of securing transcendental truth, but as the best method we have conceived thus far for achieving consensus on where to fix our beliefs.

Whilst Peirce was no revolutionary, the subtle reforms he made to the entrenched European philosophy of science at the time were to become revolutionary in the hands of his friend and contemporary, William James.

Whereas much of Peirce's work remained unpublished during his lifetime, James became known as the champion of pragmatism, publishing several books and establishing a large student following from his teaching chair at Harvard. According to Menand (2001), James invented pragmatism as a favour to Peirce in a speech at the University of California,

Berkeley, in 1898. Introducing the term to the world for the first time, James told the audience that its source was:

⁹³ For positivism to succeed in its 'anti-metaphysical' position it required that there be no facts beyond sense experience and therefore all meaningful statements must be reducible to sense experience (i.e. language is a system of sense data).

"A philosopher whose published works ... are no fit expression of his powers. I refer to Mr Charles S. Peirce, with whose very existence as a philosopher I dare say many of you are unacquainted".

James' lecture made pragmatism a subject of intellectual debate across the US (and indeed the world at large) and helped rescue Peirce's thought from the obscurity for which it seemed destined. However, whilst James and Peirce shared much in common, there were also significant differences in their worldviews (including their conceptions of truth, science and inquiry). Indeed, if it were not for the fact that James also called his position pragmatism and continually acknowledged his debt to Peirce, it is debatable whether the two would have been thought to be unfolding related strands of thought.

Of the differences between James' version of pragmatism and Peirce's, one of the most telling was their respective positions on logic. Whereas Peirce was a pioneer of modern mathematical logic and conceived of philosophy and science as coming fully under its scope, James harboured a deep suspicion of its formalities, claiming that it failed to represent the process of scientific (and philosophic) inquiry, or the structure of scientific (and philosophic) theories⁹⁴. James saw his work as somehow following on from Kant's Critique of Pure Reason (or, critique of formal logic, as he saw it). Whereas Kant tried to set strict limits on formal reasoning, and by so doing make room for such things as values and faith (which he associated with 'practical reasoning'), James (1897; 1907) suggested that formal reasoning had very little to do with how we fix beliefs.

According to James, when we think we rarely consult fixed principles. Rather, the act of thinking is often what 'fixes' these principles for us. When we reach a belief (or principle) that we are happy with, it is usually because we have worked out that it 'fits' well with the whole inchoate set of assumptions constituting our worldview. These assumptions act as a set of criteria for fixing beliefs, yet they can never be proven to be true by any standard outside the process of fixing beliefs itself. Thus, a strange kind of circularity takes place in fixing beliefs of all kinds. Truth, according to James (1907), happens to an idea. It is, in effect, the compliment we give to the outcome of our process of fixing belief. James associated this process with what he termed the 'faith ladder' and argued that all forms of inquiry involve 'steps of faith', including what are typically labelled as scientific forms of inquiry.

According to James, pragmatism was firstly a method and secondly a theory of truth⁹⁵. The 'method' of pragmatism was derived from Peirce's pragmatic maxim, first articulated in his

⁹⁴ Indeed, James was highly sceptical of logic's ability to represent the structure of reality, claiming that the formalities of mathematical logic could never adequately represent the chaos and randomness of the natural world.

⁹⁵ James' and Peirce's respective positions on truth will be discussed in Section 2.3. However, for now it will suffice to say that like Peirce, James maintained that truth is something that happens to an idea and not something inherent within it. However, unlike Peirce, James understood truth in teleological terms. That is, 'truth' was connected to 'purpose'. James' radical empiricism led him to suggest that ideas were merely rules for action and that to understand the meaning of an idea was to understand what conduct it was fitted to produce. 'Ideas', for James,

paper How to Make Our Ideas Clear. In the hands of James, Peirce's pragmatic maxim attained a radical empiricism. Indeed, James attempted to analyse every statement in terms of statements that explicitly referred to sense experience. Because of this, commentators have suggested that James' commitment to radical empiricism is the core of his entire program. However, it is argued here that behind this commitment lay a deeper motivation for James' thought: to defend pluralism (James, 1909b). Indeed, it is James' conflation of empiricism with pluralism that is perhaps the most significant factor in his defence of the former.

On the face of it, James' radical commitment to empiricism would seem to suggest that he was a positivist; however, like his mentor Peirce, his empiricism led him in the opposite direction to what he saw as the pretensions of positivism. According to James, empiricist forms of inquiry were characterised by "explaining wholes by parts", whilst rationalist forms were characterised by "explaining parts by wholes". Furthermore, James (1909b) argued that whereas the empiricist inclination to explain wholes in relation to parts led to pluralism, the rationalist inclination to explain parts in relation to wholes led to monism. This conflation of seemingly unrelated binary oppositions (empiricism/rationalism, pluralism/monism, reductionism/holism) is highly idiosyncratic and by no means self-evident. Consequently, it

requires further explanation.

According to James (1909b), the question of "the one and the many" is central to all philosophic problems. James claims that:

"If you know whether a man is a decided monist or a decided pluralist, you perhaps know more about the rest of his opinions than if you give him any other name ending in 'ist' ... to believe in the one or in the many, that is the classification with the maximum number of consequences".

James relates monism, holism and rationalism (and hence pluralism, reductionism and empiricism) by supposing that the rationalist sees the world like a grammatical sentence (where the whole is more important than the parts). Accordingly, the rationalist is concerned with apprehending this 'whole' (alternatively described as a 'unity', 'order' or 'fundamental law'). Once apprehended, the rationalist explains all observations in the light of this fundamental law. On the other hand, the empiricist conceives order as imposed upon the disorder of sensory observation by the observer selecting objects and tracing relations. According to James, "we carve out order and the world is conceived thus". Thus, the empiricist is concerned with different parts of the world and, as such, rejects totalising rationalist impositions (i.e. laws of nature).

became 'instruments' and not 'answers'. James' pronouncements on the nature of truth in Pragmatism evoked howls of indignation from Enlightenment scientists and philosophers, most notably Bertrand Russell, who dismissed the entire pragmatist position as "irrational".

Whilst agreeing with James' central concern of defending pluralism and rejecting totalising impositions, this thesis suggests that his assumption that empiricism leads to pluralism is mistaken on both logical and historical grounds. It is beyond the scope of this present work to delve too deeply into this, but for now it should suffice to say that where empiricism has been most dominant (e.g. the positivist episode) monism and reductionism have followed (e.g. the positivist program for the unification of the sciences). Furthermore, reductionism has no privileged relationship with empiricism and can be found in rationalism (Cartesian foundationalism) and empiricism (positivist unificationism) alike. Indeed, if there is a privileged relationship between any of these concepts it is the relationship between reductionism and monism (Matthews, 2004). Unfortunately, James' central task of defending pluralism can be lost in his unique characterisations of the reductionism/holism and empiricism/rationalism dualities. Accordingly, it is this aspect of James' thought that will be the focus of the rest of this overview, as it is his defence of pluralism, more than anything else, that has contributed to the emergence of characteristically postmodern forms of thought within the pragmatist philosophic tradition.

At the heart of James' pluralism was his love for the variety he saw in the world and resentment of its dismissal as 'mere appearance'. Accordingly, he defied the logic that sewed the world up into what he saw as a spurious unity. In his A Pluralistic Universe (originally a series of lectures delivered at Oxford), James (1909b) carried out a critique of monism and a subsequent defence of radical pluralism. According to James, 'history' is the key concept separating the monist from the pluralist. Whereas the monist emphasises the 'timeless character' ('nature', or 'order') of things, the pluralist suggests that nothing in the universe is static or eternal or timeless. All disciplinary knowledge is 'historic', in the sense that it is based on the conceptual schemes developed within the disciplinary tradition and not on the 'timeless character' or 'true nature' of the objects of its inquiry. James turned to Hegel to understand this 'historic' (or 'contingent') aspect of disciplinary knowledge. Directly quoting Hegel, he argued that there was a dialectical movement at work within the history of disciplinary knowledge. Rather than the disciplines progressing through the application of some transcendental (or statistical, in the case of Peirce) method toward 'truth', James (1909a) claimed that the history of knowledge is characterised by discontinuities as new and 'more satisfactory' arguments replaced old and increasingly 'unsatisfactory' ones.

According to James, Hegel's admission that the subject is part of the universe s/he is attempting to account for implies that the subject will always be situated in a historical context⁹⁶. Reality, according to James, exists in the "each-form" rather than the "all-form".

⁹⁶ However, James did not agree with Hegel that history would itself provide the key to transcending this historical context. According to Hegel, one can transcend contingent perspectives in the following manner: suppose you simply affirm A. Such an affirmation is open to supervenience by someone who claims not A, but B. The only way of making your affirmation self-secure is by getting your retaliation in first and stating A in a form that will negate all possible negations in advance. Therefore, what is posited as A must already have cancelled the alternatives. Hegel

Thus, James defines a "multi-verse" as opposed to a "uni-verse". According to James (1902), all disciplinary knowledge is historically situated and, as such, what is deemed to be 'true' attains its truth-value with respect to the historical and social context in which the inquirer is operating. James' pluralism is, therefore, supported by his conception of the faith ladder, which he argued was characteristic of the rise of all knowledge. The faith ladder is described as follows:

1. A conception of the world arises in you (no matter how). Is it true? you ask yourself.

2. It might be true, if it is not self-contradictory, you think.

3. It may be true here-and-now if it passes some test.

4. It is fit to be true for all time if it would be well to be true⁹⁷.

5. It ought to be true.

6. It must be true (something inside you whispers)⁹⁸.

7. It shall be held to be true (it shall be 'as if' true).

James argued that not one step in this entire process is logical, yet it is the way in which we fix beliefs of all kinds. Accordingly, he rejected the pretensions of the European positivists and claimed that an understanding of the process of fixing scientific belief exposed an "epistemological naïveté" in the positivist position (i.e. it was impossible to ascribe to the sciences a "God's eye view"). Furthermore, James rejected the modern claim that science had replaced pre-modern superstitions with indubitable facts. According to James, scientists are both enabled and confined by their historical, cultural, social and psychological contexts. Thus, James celebrated what he termed "openness of mind" over "dogmatic arrogance".

Following James, the next great pragmatist thinker was John Dewey. After beginning his long career as a neo-Hegelian and neo-Kantian, Dewey developed a kind of pragmatism that sought to 'naturalise' many of James' themes (Dancy & Sosa, 1992). Like Peirce and James,

maintained that the "final truth" at the end of historical dialectics would be something to which there was no imaginable alternative, because it contains all possible alternatives inside itself as "moments" already taken account of and overcome. Whilst none could argue that Hegel's vision of a single self-securing "truthful whole" was not indeed sublime, James and others have argued (and argued well) that historical dialectics do not work in anything like a Hegelian fashion and that, rather than progress toward a single timeless picture, disciplinary knowledge is characterised by discontinuities. James, in particular, thought Hegel's historical vision was sheer 'wishful thinking'.

97 Here James suggests a teleological element to 'truth'. Truth, according to James, is a property that happens to an idea when it allows us to get into satisfactory relations with other ideas.

98 Here is the classic 'leap of faith' that James suggests is behind every claim to truth and therefore undermines the monist position.

Dewey (1929) rejected all claims to certainty, making this the centre-piece of his book The Quest for Certainty. Indeed, in the opening chapters, Dewey suggested various psychological and sociological explanations for the desire for certainty and proceeded to critique the traditional binary oppositions he argued arose out of this psychological need, including subject/object, fact/value and theory/practice.

With Dewey, we see for the first time pragmatists not only rejecting the appellation 'truth', but suggesting alternatives. According to Dewey, talking about 'truth' served no useful purpose and therefore he began working with the concept of 'warranted assertibility'. Furthermore, Dewey began to reform the concept of 'knowledge', suggesting that knowing is not isolated from practice, but a kind of practice in itself. According to Dewey, knowledge was not a matter of getting reality right, but of developing successful habits of action (Rorty, 1991a). As such, thinking and doing were just two names for the same practice - the practice of making our way as best we can in a universe shot through with contingency. Thus, knowledge should be judged as other practices are - by its purposive success, rather than by some supposed standards of accuracy.

Critics of Dewey have suggested that he confused an account of the application of knowledge with an account of the possession of knowledge. However, according to Dewey's pragmatism, to have knowledge is precisely to have the ability to practice it. By conceiving of knowledge in this way, Dewey sought to dissolve traditional questions of epistemology. For too long, Dewey argued, epistemologists had made a problem of the relation of the mind to the world (or the subject to the object) in an attempt to provide secure foundations for our knowledge. Dewey's response was to point out that no one had ever made a problem of the relation between the hand and the world. The function of the hand is to help us cope with our environment; in situations where the hand does not work, we try something else, such as a foot. Nobody ever worries about a lack of some pre-ordained fit - they just use a hand where a hand will do (Menand, 2001). Ideas, theories and beliefs, according to Dewey, are the same: they are simply instruments for coping. Knowledge, therefore, is not a mirror of something that exists independently of it being known; "it is an instrument, or organ, of successful action" and the chief service of pragmatism, as regards epistemology, would be to give the "coup de grace" to representationalism.

Both James and Dewey argued for a radical mutability to what had been hitherto described as 'scientific truth'. The mistake that most people make about scientific theories, according to James, is that they think them true if they mirror the way things really are. However, according to Dewey, mirroring reality is simply not the purpose of having minds.

The criticisms of the pretensions of modern science carried out by James and Dewey have led some to argue that they articulated a parallel critique of Enlightenment rationality to that of Nietzsche. Indeed, Rorty (1991b) sees Nietzsche as the figure who did most to convince European intellectuals of the doctrines which were purveyed to Americans by James and Dewey:

"Nietzsche was as good an anti-Cartesian, anti-representationalist, anti-essentialist as Dewey. He was as devoted to the question 'what difference will this belief make to our conduct?' as Peirce or James. If all you are interested in is epistemology and philosophy of language, as opposed to moral and [social] philosophy, it will not make much difference to your subsequent conduct whether you read Nietzsche or the classical pragmatists".

Whereas post-Nietzschean thought is traditionally said to include figures such as Heidegger, Foucault, Derrida, Lyotard, Baudrillard and others, its counter-part in the US (perhaps post-Jamesian thought?) can be said to include such self-proclaimed pragmatists as Mead, Quine, Lewis, Putnam, Davidson and Rorty.

Whilst it would be simplifying in the extreme to suggest that contemporary pragmatism is synonymous with contemporary poststructuralism, it is reasonable to suggest that both traditions contain complementary ideas.⁹⁹ Both traditions critique the rationalist (Cartesian-subjective) and empiricist (Newtonian-objective) controversy over the foundations of knowledge. Both traditions see this controversy as being perpetuated by the discourse of subject-object dualism and its bedfellows, absolutism, monism and certainty. In its place, both traditions emphasise that subjectivity is an irreducible component of knowledge of objects (an epistemic position that has been variously described as constructivism, among other labels), leading to a commitment to contingency, pluralism and uncertainty. Thus, both the pragmatists and the post-structuralists share a similar distrust of the totalising narratives of legitimation characteristic of the modern era, which is what Lyotard (1979) tells us is the defining characteristic of postmodernism. This line of argument suggests that associating pragmatism with post-structuralism and thence postmodernism is, at least on the face of it, coherent. However, it has been largely thanks to the efforts of Richard Rorty that this association has begun to gain serious currency.

1.3.4.2 Pragmatism Becomes Postmodern: Rorty's Anti-Representationalism and Anti-Foundationalism

"It is useless to ask whether one [vocabulary] rather than another is closer to reality. For different [vocabularies] serve different purposes, and there is no such thing as a purpose that is closer to reality than another purpose."

99 Indeed, even labelling the works of Peirce, James, Dewey, Mead, Schiller, Quine, Lewis, Putnam, Davidson, Rorty and others under the general heading of pragmatism is dubious enough. As Schiller (1903; 1907) himself testifies: "There are as many pragmatisms as there are pragmatists".

- Richard Rorty

Richard Rorty is often considered the heir of American pragmatism. Whilst schooled in the Anglo-American analytic tradition, upon commencing his professional life Rorty quickly established himself as one of its most articulate critics. Rorty's first major work, an edited book entitled The Linguistic Turn (published in 1967), describes the attempt by analytic philosophy to turn away from 'the mind' as the source of knowledge towards 'language'. However, even at this early stage, Rorty suggested (in the editorial) that the analytic tradition shares a fundamental premise with the tradition that it has devalued. That is, the idea that "there are philosophical truths still waiting to be discovered". Following the publication of The Linguistic Turn, Rorty regularly contributed to the open literature, but did not complete another book for over a decade until the publication of his much-heralded Philosophy and the Mirror of Nature (Rorty, 1979). In this book, Rorty reviewed the various theories of mind and knowledge that had been put forward over the course of philosophic history and argued that Western philosophy's obsession with the mind as a 'mirror' of nature was the source of the modern problematic of finding the 'foundations of knowledge'. Rejecting the 'mirror' metaphor (i.e. rejecting representationalism) led to rejecting foundationalism in regard to knowledge.

By 1982 Rorty had come to associate his peculiar style of anti-foundationalism and anti-representationalism with the American pragmatic tradition and argued as much in The Consequences of Pragmatism (Rorty, 1982). According to Rorty (1991a), the core of the pragmatism of James and Dewey was the attempt to replace the notion of true beliefs as accurate representations of reality with the notion that they are successful rules for action. Thus, knowledge is not a matter of "getting reality right" but of "acquiring successful habits of action for coping with reality". It is not about representing reality and not something that can be said to have solid foundations. This position followed from Dewey (1909), who claimed that knowledge was "an instrument, or organ, of successful action" and that "the chief service of pragmatism, as regards epistemology, will be to give the 'coup de grace' to representationalism".

After the publications of Philosophy and the Mirror of Nature and The Consequences of Pragmatism, Rorty's reputation as a postmodern thinker was firmly established (even though he rarely used the term). Within these two books, Rorty articulated many of the 'postmodern' critiques of epistemology that the post-structuralists were expounding in Europe at the same time. Thus, the 1970s and 1980s saw the beginnings of a parallel set of critiques of 'modernity' from both the pragmatic (Rorty) and post-structuralist (Foucault, Derrida etc) traditions.

In his early works, Rorty acknowledged three thinkers who shaped his positions (and cemented his rebellion against his analytic upbringing) more than any others. These were Dewey, Wittgenstein and Heidegger.¹⁰⁰ What impressed Rorty about these thinkers was the way in which they either dissolved or radically changed the shape of traditional philosophic problems as a result of employing new assumptions or vocabularies. According to Rorty, Dewey, Wittgenstein and Heidegger each in their early years tried to find a new way of making philosophy 'foundational'. Dewey attempted to construct a naturalised version of Hegel's vision of objective-knowledge-through-historical-dialectics. Wittgenstein's (1922) Tractatus provided a new theory of representation that was based on the structures of a sense-datum language (and therefore had nothing to do with the traditional object of epistemology: the mind). And Heidegger tried to construct a new set of philosophic categories that had nothing to do with science, epistemology or the Cartesian quest for certainty, but found its foundations in 'Being'.

What Rorty respected about each of these thinkers, however, was not the novel theories they articulated in their youth, but that each of them eventually came to see these efforts as self-deceptive and in their later work broke free of the conception (inherited from Descartes and Kant) of philosophical thinking as 'foundational'. Dewey abandoned his attempt to make knowledge foundational and began to describe knowledge as 'justified belief', dropping the 'true' from the well known Platonic definition. Accordingly, Dewey began to emphasise the social aspects of justification rather than the empirical aspects of correspondence between the knowing subject and reality. Wittgenstein began to see language as a tool rather than a mirror. Accordingly, he saw the central problem of the Tractatus, that of finding the necessary conditions for the possibility of linguistic representation, as self-delusive. In fact, he described his earlier efforts as "buzzing around inside a fly-bottle". Finally, Heidegger rejected his existentialist and phenomenological upbringing and began to see the attempt to make the knowing subject the source of necessary truths as one more self-deceptive attempt to substitute a determinate question for "that openness to strangeness which initially tempted us to begin thinking".

At the heart of Dewey, Wittgenstein and Heidegger's later thought was, Rorty argued, the rejection of knowledge as 'accurate representation' or the mind as a 'mirror of nature'. Following the lead of his philosophic heroes, Rorty (1979) made this the centre-piece of his book Philosophy and the Mirror of Nature, claiming:

"The aim of this book is to undermine the reader's confidence in 'the mind' as something about which one should have a philosophic view, in 'knowledge' as something about which there ought to be a 'theory' and which has 'foundations'".

100 The later Rorty would probably add Derrida and Davidson to this company.

According to Rorty, before Kant there was no conception of philosophy as a discipline distinct from science. However, following Kant, philosophy and science became partitioned by the notion that philosophy's goal was to provide a 'theory of knowledge' that would be distinct from the specific knowledge of the sciences because it was their foundation. Thus, philosophy as a discipline attempted to 'underwrite' the specific claims to knowledge made by other disciplines. According to Rorty, the Kantian conception of the role of philosophy assumes that there is a specific 'problem of knowledge'. This idea finds its source in the assumption that knowledge is an accurate representation of mind-independent reality. One of Rorty's major contributions is the suggestion that if the traditional way of viewing knowledge is optional, then so is the entire modern epistemological problematic.¹⁰¹

Following his rejection of the 'mirror' metaphor, Rorty proposed an alternative characterisation of knowledge to the traditional (modern) one, suggesting that knowledge could be thought of as either:

1. A relation to propositions (and therefore, justification becomes a relation between the propositions in question and other propositions from which the former may be inferred), or

2. A privileged relation to the objects those propositions are about (and therefore justification is the proof of this relation).

If knowledge is construed as (1), then there is no special 'problem of knowledge', because there is no need to end the potentially infinite regress of propositions brought forward in defence of other propositions.¹⁰² However, if knowledge is construed as (2), then there arises a special 'problem of knowledge' (the epistemological problematic). This problem arises because there would be a justifiable desire to go beyond 'argument' to 'compulsion from the object known', such that argument would be impossible because "anyone gripped by the object in the required way will be unable to doubt". To reach this point is to reach the foundations of knowledge that the epistemologists so greatly desire.

Because of Rorty's insistence that (1) is a more useful way of construing knowledge than (2), his position on knowledge has been described as 'anti-foundational' and 'anti-representational'. Indeed, Rorty often quotes Quine's (1969) doctrine of the 'indeterminacy of reference' to argue that there is no self-evidence involved in attributing meaning to utterances and thence to objects. 'Anti-representationalism' accepts there is no way of finding

101 Rorty argued that the idea of a 'theory of knowledge' grew up from the problems (highlighted by Hume and Kant) of knowing whether our inner representations were accurate. The idea of a discipline devoted to 'the nature, origins and limits of knowledge' - the textbook definition of epistemology - derived from Kant's Critique of Pure Reason. However, if knowledge is no longer defined in representationalist terms, the problems highlighted by Kant cease to be problems at all.

102 This position is similar to what Putnam (1981) calls the internalist conception of philosophy.

an independent test of accuracy of representation. Or, as Davidson (1984) puts it, "there is no chance that someone can take up a vantage point for comparing conceptual schemes by temporarily shedding their own". According to Rorty, if knowledge is construed as (2) we need to "climb out of our minds" in order to justify ourselves. However, if we understand knowledge as (1), it is not judged transcendentally but by the standards of the inquirers of our own day. Accordingly, justification can never be achieved by reference to mind-independent reality but must always be sought by reference to what an individual or community of inquirers already accepts.¹⁰³

With the demise of foundational epistemology, various attempts have been made to fill the vacuum. One such attempt has been with the 'linguistic turn' and the subsequent study of the philosophy of language. For example, Dummett (1976; 1978; 1981; 1982) sees philosophy of language as foundational in the sense that, following the linguistic turn, epistemological issues are now, at last, being formulated correctly as issues within a theory of meaning. Similarly, some have attempted to ground knowledge in the study of human cognitive practices, the ways people and groups form beliefs. That is, epistemology is transformed into a branch of psychology. Rorty rejects both of these 'new foundationalisms', claiming that the demise of foundational epistemology did not come about due to the modern epistemic program looking for the foundations of knowledge in the wrong places, but due to the failures of foundationalist thinking in its own right. Accordingly, Rorty hopes that "the cultural space left by the demise of epistemology will not be filled", claiming that the notion that there is an ahistorical, neutral framework capable of adjudicating knowledge claims is a fallacy. Such a framework assumes that all possible discourses can be translated into it and hence are commensurable and directly comparable.¹⁰⁴

Much of Rorty's later thought could be characterised as providing an insight into what scientific, philosophic, political and cultural life could possibly look like with the rejection of the mirror metaphor of knowledge (Rorty, 1999). He describes this new outlook as one "in which the demand for foundations is no longer felt", arguing that:

"The dominating notion of epistemology is that to be rational, we need to find agreement with other human beings. To construct an epistemology is to find the maximum amount of common ground with others. The assumption that an epistemology can be constructed is the assumption that such common ground exists. Sometimes this common ground has been imagined to lie outside us - for example in the world of Being as opposed to that of Becoming, in the Forms which both guide inquiry and are its goal. Sometimes it has been imagined to lie within us, as in the seventeenth century's notion that by understanding our

103 Here Rorty suggests that justification is always a matter of coherence with other beliefs. However, these 'other beliefs' are only capable of a similar form of contextual justification.

104 By 'commensurable', Rorty means "to be brought under a set of rules which will tell us how rational agreement can be reached or what would settle the issue on every point where statements seem to conflict".

own minds we should be able to understand the right method for finding truth. Within analytic philosophy, it has often been imagined to lie in language, which was supposed to supply the universal scheme for all possible content. To suggest that there is no such common ground seems to endanger rationality."

Amongst traditionally minded philosophers, Rorty's anti-foundationalism has been met with considerable ire, as it seems to endanger the idea of the philosopher as the guardian of rationality. Rorty understands this and suggests that the role of the philosopher since Plato be re-cast. According to Rorty, the dominant conception of the philosopher is the Platonic 'philosopher-King', whose job it is to oversee all other areas of inquiry and who knows what everyone else is 'really doing' (whether they know it or not) because s/he knows about the ultimate area of inquiry, the foundations of knowledge, in which everyone else's work is grounded. Finding an appropriate characterisation of the foundations of knowledge is the role of this 'philosopher-King', thus providing the spark for the traditional (modern) epistemological problematic. However, Rorty offers an alternative image to that of the philosopher-King. With the demise of foundationalism, the philosopher could become what Rorty has variously described as the 'informed dilettante', the 'multi-lingual', the 'poly-pragmatic' or the 'Socratic intermediary between various discourses'. Under the Socratic intermediary's 'therapy', Rorty sees monistic thinkers "charmed out of their self-enclosed practices and disagreements between disciplines and discourses compromised or transcended in the course of the conversation".

This alternative conception Rorty labels the 'hermeneutic' approach, as opposed to the 'epistemological' approach, to philosophy.¹⁰⁵ According to Rorty, the hermeneutic approach sees the relations between various discourses as strands in a conversation that pre-supposes no disciplinary matrix. Whereas agreement is the goal of the epistemological approach, understanding is the goal of the hermeneutic approach. According to the 'epistemologist', agreement is a token of the existence of some antecedently existing common ground that unites human kind in a universal rationality. Thus, agreement is something to be sought after, as it is the best guide we have of closeness to truth. Similarly, to be rational is to "find the proper set of terms into which all contributions should be translated". The 'hermeneutic', on the other hand, sees agreement as simply 'agreement', neither intrinsically superior nor intrinsically inferior to disagreement and definitely not a token of some transcendental truth. Thus, according to the hermeneutic, agreement has no inherent superiority over "exciting and fruitful disagreement". Furthermore, to be rational is to "be willing to refrain from thinking there exists a special set of terms in which all contributions to a discourse must be put".

In his later writings, Rorty swaps the specific labels of 'epistemology' and 'hermeneutics' for the more general 'systematic' philosophy, which is foundational (i.e. epistemology), and 'edifying' philosophy, which is anti-foundational (i.e. hermeneutics). In these writings Rorty argues that whereas systematic philosophy builds towering edifices of ideas aimed at solving the problems of the current generation, edifying philosophy tries to dissolve the problems of the current generation by deconstructing the towering edifice of ideas that created them in the first place. Indeed, Rorty, more than any other pragmatist, has argued that the role of pragmatism is to dissolve difficult problems (such as the nature of truth), thus enabling us to break free from unsuccessful discourses by focussing on issues of more interest. Whereas the vast majority of Western philosophy is systematic, in the sense that it searches for water-tight theories, the work of the American pragmatists Peirce, James and Dewey, and the post-Nietzschean European philosophers Wittgenstein, Heidegger and Derrida is edifying. These edifying thinkers are 'pragmatic' in the sense that they by-pass unsuccessful discourses and 'sceptical' in the sense that they distrust the systematic philosopher's projects of universal commensuration.

According to Rorty, the point of edifying philosophy is to "keep the conversation going". Indeed, he argues that philosophy fades into self-deception when it attempts to do more than send the conversation off in new and interesting directions. Such new directions may seed new sciences and/or new systematic philosophies, but they are beside the point. The point is always the same: to perform the social function which Dewey called "breaking the crust of convention".

Rorty's pragmatism has significant implications for the scientific enterprise. Much of his writing can be seen as deconstructing traditional accounts of truth and knowledge and, by way of implication, science's claim over them. Accordingly, Rorty argues that we should give up the idea that the goal of science is to produce models that correspond with reality. Instead, we should understand that so-called 'scientific' models are but one vocabulary among many. Our question then ought to be: "which vocabulary works best for our purposes?"

1.3.5 Towards a Postmodern View of Science and Systems

"To the extent that science does not restrict itself to stating useful regularities and seeks the truth, it is obliged to legitimate the rules of its own game. It then produces a discourse of legitimation with respect to its own status, a discourse called philosophy [of science]. I will use the term modern to designate any science that legitimates itself with reference to a meta-discourse of this kind making an explicit appeal to some grand narrative ... [such as] the Enlightenment narrative, in which the hero of knowledge works toward a good ethico-political end ... Simplifying to the extreme, I define post-modern as incredulity toward meta-narratives. This incredulity is undoubtedly a product of progress in the sciences: but that progress, in turn, presupposes it. Consequently, the great narrative function is losing its functors, its great hero, its great dangers, its great voyages, its great goal. The society of the future falls less within the province of a Newtonian anthropology (such as structuralism or systems theory) than a pragmatics of language particles."

105 Because Rorty's struggle against the foundationalist idea of universal commensurability is akin to the struggle contemporary hermeneutics has undertaken against reconstruction (Gadamer, 1960).

- Jean-Francois Lyotard

In the preceding chapters we have surveyed two major intellectual shifts. The first shift is described as a shift from the pre-modern (or medieval) worldview to the modern worldview. The development of modernity is largely associated with the Enlightenment and the associated revolutions in science and philosophy. However, initial moves toward modernity can be seen in the Reformation's disenchantment with the medieval reliance on revelation and authority, and the Renaissance re-discovery of classical art, science and literature (notably, the re-discovery of the Greek focus on observation and reason). The new worldview ushered in by these events has been variously described and characterised. In particular, this work has used the phrase:

Newton's mechanistic universe populated by Descartes' autonomous, rational subject, who is armed with Bacon's scientific method, and hence capable of discovering nature's truths and ushering in a new age of progress.

This phrase was chosen, because it contains four key ideas that have come under increasing sceptical attack in recent times. They are:

1. Mechanistic Metaphysics - The dominant metaphysical presupposition of modernity has been 'mechanism'. According to the modern mind, the world is governed by fundamental laws and works in a similar way to that of a machine. The high point of mechanism was undoubtedly the development of Newtonian mechanics. According to Newton, the universe is a determinate machine whose movements can be predicted by fixed, quantifiable and observable laws.

2. Foundationalist Epistemology - Ever since Plato, Occidental thought has been obsessed with truth and certainty. Most attempts at providing the sort of certainty required have revolved around the idea that knowledge needs to have firm foundations. These foundations have been variously sought (recall the controversy between the rationalists and empiricists). What is common to each position, however, is the belief that knowledge (to be knowledge at all) must be an accurate representation of mind-independent reality. Thus, the foundations of knowledge were thought to be secured by the accuracy of representations. Moreover, the accuracy of representations was thought to be secured by keeping subjects (inquirers) separated from their objects of inquiry, and so removing observational error (empiricism) or errors of reasoning (rationalism) in the construction of knowledge of objects. If human error could be systematically removed, then it was thought that our representations would be indicative of the true nature of the phenomena under study and that this would provide the sort of certainty we craved.

3. Methodological Monism - The focus on subject/object dualism and the need to keep observations impartial and reasoning sound led to the modern obsession with 'method'. According to the modern scientist, there existed a single 'scientific method' capable of ensuring subjects are never implicated in the construction of knowledge of objects. Whilst it is difficult to maintain that modernity was characterised by agreement on exactly what this method was, there is no difficulty in asserting that the moderns (rationalists and empiricists alike) accepted that 'scientific method' was central to proper inquiry and that any form of inquiry that did not embody 'scientific method' was unable to achieve the cognitive status required of 'meaningful' knowledge.¹⁰⁶

4. Emancipation through Truth - Perhaps the over-arching motivation of modern thought was the belief in the emancipatory potential of truth. According to the modern intellectual, to be 'modern' was to have rejected the superstitions associated with 'pre-modern' forms of life. These superstitions were thought to be irrational and oppressive. The truth, on the other hand, would set them free from the irrationality and oppressiveness of pre-modern superstitions. Once the 'true nature' or 'governing laws' of all things were known, humankind would know how to live in optimal relationship with their environment (both natural and social) and utopia would dawn.

Modernity has been the single most dominant force in shaping the western world over the past 400 years. It has pervaded all aspects of intellectual and cultural life and shaped the nature of the physical, human and analytical sciences accordingly. This thesis, however, makes the assertion that modernity is in decline. Indeed, the previous Sections have described a second intellectual shift, which has only recently begun to migrate from the occasional philosophy and literature school into the sciences (natural, human and analytical) and indeed the wider community. Thus far it has been described as a rejection of modernity and as such has been termed postmodern. The rise of postmodernism has been presented in the form of an historical conversation sequence. Immanuel Kant is seen as starting this conversation, whilst Nietzsche (Europe) and James (America) are seen as securing the turn towards postmodernism (see Figure 2 below).

106 Here we see the great paradox of the modern commitment to method. Whilst no single account of scientific method ever gained predominance for long, there was a general acceptance that to be meaningful, knowledge must be obtained through 'scientific method'.

Rationalists (Descartes, Leibnitz etc.)  |  Empiricists (Hume, Locke, Berkeley etc.)

Immanuel Kant: All knowledge is infused with a priori concepts of understanding.

Georg Hegel: A priori concepts are not transcendental. They are historical!

Friedrich Nietzsche: Human knowledge does not progress closer to truth. New theories are new perspectives.  |  William James: Truth happens to an idea. It is the compliment we give to the outcome of our process of fixing belief.

Post-Nietzscheans (Wittgenstein, Gadamer, Heidegger etc.): Language cannot provide the foundations for knowledge. Therefore, knowledge is without foundations.  |  Early Pragmatists (Dewey, Schiller, Mead etc.): Knowledge is not a matter of getting reality right but of developing successful habits of action.

Post-Structuralists (Foucault, Derrida, Lyotard etc.): Truth claims are assertions of power (Foucault). Deconstruction seeks to delegitimate these assertions (Derrida). The postmodern is incredulous to all legitimation narratives (Lyotard).  |  Neo-Pragmatists (Rorty etc.): Philosophy cannot legitimate knowledge claims. Therefore, the role of philosophy is to dissolve current problems by deconstructing the ideas that created them in the first place, thus enabling us to break free from unsuccessful discourses.

POSTMODERNISM

Figure 2: A Map of the Historical Conversation Sequence of the Epistemic Story of Postmodernism

Scholars continue to disagree as to what exactly postmodernism is. However, there seems to be consensus on one point. Postmodernism marks the end of the reign of the modern worldview in both intellectual and cultural life.

Whereas modernity was committed to mechanism, foundationalism, monism and truth, postmodernism is committed to an entirely different set of ideas, including:

1. Contextualist Metaphysics - Whereas the modern intellectual maintained that the world is a machine governed by universal laws, the postmodern argues that the machine metaphor has run its course and is increasingly unhelpful in understanding the world around us. This thesis suggests that 'contextualism' (the metaphysics of inter-connectedness) is an appropriate replacement. Rather than governing laws, the contextualist seeks organising frameworks (or contexts). The development of these contexts is what produces our understandings of the world and therefore Newtonian mechanics ceases to be the fundamental description of the world and becomes just one of many possible contexts for understanding the physical world (alongside Einstein's general relativity and Bohr's quantum mechanics).

2. Anti-Foundationalist Epistemology - Whereas the modern intellectual was totally committed to truth and certainty, the postmodern argues that such aims misunderstand the nature of Kant's Copernican shift. Furthermore, the postmodern argues that once Kant's critique has been understood, we inevitably drift towards Nietzsche (perhaps with a minor excursion through Hegel's, Kierkegaard's or the Romantic movement's thought). In Nietzsche, the sceptical attack on modern foundationalist epistemology reaches a crescendo. The Kantian categories in which we think and know are reduced to malleable, contingent structures, passed down to us through our cultural or intellectual heritage. That is, they are thoroughly contingent, overwhelmingly fallible and totally unavoidable. Moreover, we can never fully justify our knowledge claims (at least not in the foundationalist-representationalist way that the moderns demand) because they are always infused with a priori concepts of understanding. As such, knowledge should not be thought of as an accurate representation of reality or of something that can be said to have firm foundations.

3. Methodological Pluralism - One of the enduring weaknesses of the modern position was the inability of philosophy of science to settle on an appropriate method for partitioning science from non-science. Methodological disputes have been par for the course throughout the modern period. However, with the rejection of subject/object dualism and the quest for certainty, this methodological problematic loses much of its significance. Inquiry is released from the dogmatic monism that it

aspired to, into a pluralism that, to some extent, had always characterised it. Accordingly, knowledge is not just contingent on epistemological grounds but on methodological ones as well (this thesis explores this in much more detail in Chapters 2.1 and 2.2).

4. Emancipation through Deconstruction - Modern intellectuals assumed that it was possible to know the truth and that the truth would set them free. Knowledge, therefore, was thought to be inherently good and was pursued in order to benefit society at large. However, in opposition to these assumptions, postmoderns (because of their focus on the contingent and contextual nature of knowledge) do not accept that it is possible to know the truth and, as such, claim that every assertion of truth is an illegitimate assertion of power. Rather than being inherently emancipatory, truth claims are inherently oppressive. They attempt to fix systems of thought and close debate and never acknowledge their own contextual underpinnings. The postmoderns, therefore, see a need to be emancipated from the totalising impositions of these truth claims. This is where deconstruction comes in. By surfacing the inherent assumptions in modern claims to truth, the postmodernist delegitimates them. Moreover, by arming individuals with the ability to see the contingent, historical nature of metaphysical concepts, disciplinary knowledge and social structures, postmodern thought allows them to deconstruct the historical legacies that have been thrust upon them by virtue of their being born in a particular place at a particular time. And inasmuch as it delegitimates the totalising narratives of modernity, postmodern thought may just help to initiate broad societal change.

Unfortunately, the postmodern philosophical program to date has largely been a negative one, concerning itself with the overthrow of modernity and its particular legitimation narratives (i.e. its pretensions to objectivity and truth), rather than setting up a positive philosophy of its own. Because of this, some have claimed that postmodernism offers nothing constructive at all and is simply unbridled scepticism (Sim, 2001). However, it is argued here that postmodernism could and should have a positive philosophical program. Indeed, elements of this can be seen already in the writings of Lyotard, Foucault, Derrida and Rorty. What makes the emerging worldview postmodern is not the demise of the narratives of legitimation characteristic of modernity, but the rejection of the whole idea of a narrative of legitimation!

The implications of postmodernism for science are not well understood. As we shall see (Part 2), the modern claim that science had replaced pre-modern narratives and metaphysics with empirical facts and verifiable theories has been progressively undermined. Accordingly, some now claim that science is merely the sacred superstition of the modern world (Feyerabend, 1975). Others have argued that the death of modernity ultimately means

the death of science - there is no such thing as a universal law of nature, nor is there any privileged method of inquiry or privileged representation of the world (Lyotard, 1979). Indeed, what is typically taken to be an independently existing object is, in important senses, subject constructed.

Whilst this thesis acknowledges many of the critiques of 'Enlightenment' science made by Hume, Kant, Nietzsche, Heidegger and others, as well as many of the postmodern critiques that have followed, it does not argue for a descent into anarchy, nor, for that matter, the death of science. Indeed, it contends that, rather than being seen as the enemies of science, the post-Nietzschean postmoderns should be seen as its reformers. Thus, postmodernism should and could have an enormous impact on both science and systems thinking107. It challenges all attempts to unify the sciences, either by a hierarchical model (Section 2.1.2), a single scientific vocabulary (Section 2.1.3) or a single scientific method (Section 2.1.4). Furthermore, postmodern thought calls into question traditional accounts of the nature of science (Part 2) and systems (Part 3).

The impact of postmodernism on science is not, however, limited to highlighting the naiveties of modern science. Postmodernism could and should have a positive scientific program as much as it should have a positive philosophic program. In this thesis it is argued that the systems approach is well positioned to become the positive program that postmodern science currently lacks. Indeed, it is contended that the founders of the various 'systems approaches' were motivated by a general disillusionment with the dominant metaphysical presuppositions of modern science. This disillusionment was nearly always described in terms of a mistrust of 'reductionism' and 'analysis' and a call to 'holism' and 'synthesis'. However, it is contended that whilst the pioneers of the early systems approaches felt the 'itch', they did not know exactly 'where to scratch'. Thus, much of the literature focussed on the overthrow of mechanism and little of it on the epistemic self-understanding of inquiry. The systems theorists, therefore, largely (and uncritically) associated themselves with modernity: simply replacing one meta-narrative (mechanistic reductionism) with another (systemic holism).

Whilst the critique of mechanism has been laudable, if the systems approach is not simply to become another competing (and flawed) meta-narrative of legitimation, it will need to combine this critique with corresponding critiques of the three other modern positions highlighted in this Section, namely the belief in foundationalist epistemology, methodological monism and emancipation through truth.

107 Indeed, the scientific enterprise has already begun to be reshaped by postmodernism. Instead of a single scientific enterprise subdivided into well-defined disciplines, the science of today is characterised by ill-defined and constantly shifting areas of inquiry. Each of these specialities boasts its own 'language game' and frequently conducts its work without recourse to a universal scientific 'meta-language' or a set of authoritative methodological principles.

Part 2: Science

'"This much is ce¡'tain, that whoever has once tasted critique will be ever after disgusted with all dogmatic twaddle with which he was hitherto contented."

- Immanuel Kant

After tasting 'critique' in the form of Kant's Critique of Pure Reason and its appropriation by Nietzsche, James and the postmoderns, it may be appropriate to revisit some 'dogmatic twaddle' with which Occidental thought has been hitherto contented. As such, Part Two aims to revisit such basic concepts as reductionism, science, and truth.

Thus far it has been argued that the traditional narratives of legitimation of modernity are flawed on epistemic grounds. Notwithstanding this, a concerted effort has been undertaken to salvage these narratives, defend them from sceptical attack and restore to them the pre-eminence that they once enjoyed. This is the program of philosophy of science, with its emphasis on partitioning legitimate 'scientific' inquiry from illegitimate 'unscientific' inquiry. At its core, philosophy of science is a discourse of legitimation. Indeed, it is the legitimation story of the modern era. The story charts the adventures of its hero, science, who, having defeated its great nemesis, superstition, is now advancing towards knowledge of the true nature of the world around us. Such a story stands in direct contradiction to the postmodern incredulity towards metanarratives. If valid, it could provide the metanarrative of legitimation that modernity seeks. It would become, in short, the latest, greatest story ever told. In what follows, various aspects of this story are unfolded. It is argued that the quest for an ahistorical, unrevisable framework of inquiry (i.e. the scientific method) through which inquirers may pursue an ahistorical, unrevisable truth of the matter (i.e. scientific truth) has been spectacularly unsuccessful. As such, philosophy of science has proved unable to live up to the pretensions of its own metanarrative. As Lyotard (1979) states: "science ... is incapable of legitimating itself, as speculation assumed it could". However, paradoxically, in finding reasons why it could not fulfil its great hope, philosophy of science may just have cleared a path through the scientism of modernity and towards a post-scientistic postmodernity.

2.1 Scientism and Reductionism

"Scientism is one of the most dangerous contemporary intellectualtendencies."

-

Given the subject of this thesis (systems), perhaps a good place to start our discussion of science is with the cognitive attitude so common to the sciences, yet so roundly criticised by the systems theorists - reductionism. As shall become evident, reductionism is an ideology about how scientific inquiry may obtain its legitimation. In Part One, the idea of finding an all-encompassing basis of legitimation for knowledge (i.e. epistemic foundationalism) was criticised. Here, the critique of foundationalism is continued by discussing the illegitimate use of reductionist ideology to provide these foundations. However, before deconstructing the role of reductionism in science, a constructive contribution in terms of its definition will need to be put forward. This constructive contribution makes a distinction between reduction (an epistemic operation) and reductionism (a scientistic ideology).

2.1.1 From Reduction to Reductionism

"Reductionism is the belief that each meaningful statement is equivalent to some logical construct upon terms which refer to immediate experience."

- Willard Van Orman Quine

The problems associated with reduction have been well documented, as both the term and philosophical discourse on it have been around for centuries (Stoeckler, 1991). Mario Bunge (1990) in his paper The Power and Limits of Reduction categorises reduction as "an epistemic operation" and describes the nature of the operation by stating that:

"We take it that to reduce A to B is to identify A with B, or to include A in B, or to assert that every A is either an aggregate, a combination or an average of Bs. It is to assert that, although A and B may appear to be very different from one another, they are actually the same, or that A is a species of the genus B, or that every A results somehow from Bs, or, put more vaguely, that A boils down to B, or that in the last analysis all As are Bs."

The reduction of one entity, A, to a combination of 'more fundamental' entities, Bs, reduces the significance of the former entity in favour of the latter entities. Accordingly, reductions are rarely accepted without heated debate. Historic examples of such debates include, but are not limited to, the debates concerning the reduction of the mental (i.e. consciousness) to the

physical (the mind-body problem), of thought to the perceived (the rationalist-empiricist dilemma), of language to reality (Plato's identification of ἰδέα with φύσις and Wittgenstein's reduction of language to sense-datum), of the thermal to the mechanical (the kinetic theory of gases), of the living to the non-living (the primordial swamp), of the chemical to the physical (e.g. Kemeny and Oppenheim's (1956) assertion that chemical substances can be reduced to atomic physics) and of the mathematical to the logical (Russell and Whitehead's (1910; 1912; 1913) Principia Mathematica). Some of these debates seem to be settled for a while, only to break out once more with all the enthusiasm and vigour that accompanied the previous iteration108.

In contrast to the long history of philosophical discourse surrounding the term reduction, the term reductionism is a relatively recent arrival on the philosophic landscape. Looking for a descriptive label for one of his Two Dogmas of Empiricism, the philosopher Willard Van Orman Quine (1953a) created the term to refer to "the belief that each meaningful statement is equivalent to some logical construct upon terms which refer to immediate experience". Or, that every idea must be reducible to language and thence sense experience109. The foci of Quine's critique were the logical positivists, and, in particular, his mentor Rudolf Carnap, whose audacious project, Aufbau, attempted to reduce the entirety of scientific discourse, statement by statement, into a sense-datum language110. Quine's (1953a) Two Dogmas described how ideology functions to motivate research projects such as Carnap's Aufbau. In particular, it presented 'reductionism' as one of the two ideologies (or dogmas) motivating the positivist school.

The relationship, however, between Quine's reductionism and the much-discussed reduction is left uncharacterised in Two Dogmas. Accordingly, it is the intention of this Chapter to first characterise this relationship, second, generalise it and, third, demonstrate how reductionist ideology functions to promote proto-scientific speculative pursuits such as the various attempts to unify the sciences, consolidate scientific theories (into grand unifying theories) and partition science from non-science.

Unfortunately, despite the depth of discourse on various instances of attempted reductions, the relationship between reduction and the new ideology of reductionism has been remarkably ill-defined. One of the reasons for this is that whereas reduction is a general epistemic operation (applied to ontological, theoretical and methodological entities), Quine's

108 Currently, there seems to be a consensus that it is not useful to consider electromagnetic phenomena as mechanical in nature (Hoyningen-Huene, 1990). Similarly, since Gödel's incompleteness theorem, few have argued for a logical basis of all mathematics (Dummett, 1978). However, notwithstanding the current agreement in regard to these two, most other examples of reduction still cause significant debate.
109 In the positivist jargon, the dogma is defined as "the belief that a term, to be significant at all, must be either a name of a sense datum or a compound of such names or an abbreviation of such a compound".
110 Indeed, Carnap's language was not entirely a sense-datum language in its purest sense. He included the notations of logic, up through higher set theory and eventually the entirety of pure mathematics. It was this project that led the logical positivists to claim that: "it is mathematical representation alone that makes possible the notion of objective knowledge". Nelson Goodman's (1951) "The Structure of Appearance" showed convincingly that Carnap's efforts in Aufbau not only failed, but failed spectacularly. For a more detailed discussion of the logical positivist school, the reader is directed to Section 2.2.2.

'reductionism' is a specific scientific prejudice (common amongst empiricists), demanding the reduction of every scientific statement to a logical construction of signifiers111. In particular, empiricism demands the reduction of thought to the perceived, denigrating all statements not reducible to observation as meaningless. The question that immediately arises is: what is the difference between Quine's reductionist ideology and classical attempts to reduce thought to perception? The answer lies in the empiricist's insistence on using this epistemic operation as the criterion for granting thought the same status as perception. That is, empiricism uses the reduction operation as the criterion of cognitive respectability (or legitimacy). At this point, the empiricist ceases to talk about a single epistemic operation and begins to promote an ideology. The relationship between reduction and reductionism can therefore be summarised as follows: the (reductionist) ideology is based on the use of the (reduction) operation as the criterion of cognitive respectability (or legitimacy). Whereas Quine introduced the term to describe the dogma associated with demanding the reduction of thought to the perceived, it is generalised here by using the term to describe the dogmatic nature of demanding any reduction as the criterion of cognitive respectability. Thus, the following definitions are suggested:

Reduction: A reduction is an epistemic operation between two entities, A and B, which eliminates the difference between them. Therefore, to reduce A to B is to state that A is (in some way) composed entirely of Bs.

Reductionism: Reductionism is an ideology that suggests that the legitimacy of one domain (A) is dependent on it being reducible to another domain (B).

Accordingly, there are as many forms of reductionism as there are forms of the reduction operation. Of particular interest to the discussion that follows are the reductionisms based on reductions in the ontological, theoretical and methodological domains. These are defined as follows:

Ontological Reduction: An ontological reduction is an epistemic operation between two proposed ontologies that reduces one to the other, thereby eliminating the perceived difference between the two. Of particular interest is the ontological reduction of scientific elements between disciplines. For example, the ontological reduction of chemical compounds (the elements of chemistry) to atoms and thence elementary particles (the elements of particle physics).

111 According to Quine, reductionism is an ideology that is used to classify statements as either meaningful or meaningless. As Quine rightly highlights, this dogma was one of the cornerstones of logical positivism, which claims that speculative assertions not reducible to observations are meaningless and therefore should be eliminated from science.

Ontological Reductionism: Ontological reductionism, on the other hand, is the ideology that states that an ontological reduction legitimates the ontologies of 'higher' disciplines. This is often expressed as inclusion in the category 'science', whereby the ontological reductionist demands that, for a discipline to be deemed 'scientific', its elements must be ontologically identical to (in the sense that they can be reduced to) the elements of the most fundamental discipline112. For example, to be deemed scientific, psychological entities must be deemed to be composed entirely of physical entities. By implication, the ontological reductionist assumes that the world is atomistic. If there exists a field of study whose elements are not reducible to those of the most fundamental discipline, then according to the ontological reductionist, the inquirers of that field are studying things that do not exist.

Theory Reduction: A theory reduction is an epistemic operation between two theories that reduces one to the other, thereby eliminating the perceived difference between the two. Of particular interest are theory reductions that reduce all of the descriptive concepts and governing laws of one discipline into those of another. In most cases the elimination of the difference occurs by a set of reductive definitions, which define all of the concepts and laws of the 'higher' discipline in terms of those in the 'more fundamental' discipline. For example, the reduction of the concept 'temperature' to a function of pressure, volume and the total number of atoms (via Kinetic Theory)113.
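The reductive definition in this example can be written out explicitly. Using the ideal gas relation of the kinetic theory (the relation cited in footnote 113), with P the pressure, V the volume, n the total number of atoms and k Boltzmann's constant:

```latex
% Ideal gas relation of the kinetic theory
% (P = pressure, V = volume, n = number of atoms, k = Boltzmann's constant)
\[
  PV = nkT
\]
% Rearranged, this is a reductive definition: the thermodynamic
% concept 'temperature' expressed entirely as a function of
% mechanical quantities of the 'more fundamental' theory
\[
  T = \frac{PV}{nk}
\]
```

Nothing distinctively thermodynamic remains on the right-hand side: 'temperature' has been defined away in favour of the concepts of the more fundamental mechanical theory, which is exactly the pattern of reductive definition described above.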

Theoretical Reductionism: Theoretical reductionism, on the other hand, is the ideology that states that a theoretical reduction legitimates the theories (and concepts) of 'higher' disciplines. In order to be legitimate, the concepts and laws of a given field of study must be reducible to the concepts and laws of a more 'fundamental' discipline and thence to the most 'fundamental' discipline. This is perhaps the most contentious of the reductionisms. As with ontological reductionism, it is often expressed in terms of the demarcation of science from non-science, stating that to be deemed 'scientific' the concepts and laws of one discipline must be derivable from (or reducible to) the concepts and laws of the most fundamental discipline. Thus, for example, the laws of biology must be derivable from the laws of physics. This ideology is often associated with Descartes' analytic method, which states that: "a term is scientifically analysable if and only if it is reducible to primitive terms by a chain of definitions" (Descartes, 1637).

Methodological Reduction: A methodological reduction is an epistemic operation between two methodologies that reduces one to the other, thereby eliminating the perceived difference between the two. Of particular interest are the methodological reductions that make use of the concept of a meta-methodology - a governing framework of inquiry that both methodologies

112 The most common form of ontological reductionism is physicalism: the specific form which takes the objects of study within physics as the fundamental elements. A corollary of this position is that the study of these fundamental elements should yield knowledge of the entities of all other disciplines.
113 PV = nkT, where n is the number of atoms and k is Boltzmann's constant.

can be translated into. For example, it is often assumed that the different modes of inquiry involved in particle physics, organic chemistry, clinical psychology, natural ecology, applied mathematics and astronomy can be translated into a meta-methodology known as 'the scientific method'.

Methodological Reductionism: Methodological reductionism, on the other hand, is the ideology that states that a methodological reduction legitimates the methods of inquiry of 'higher' disciplines. As with ontological and theoretical reductionism, methodological reductionism is often used as a criterion of scientificity. According to the methodological reductionist, to be 'scientific' the process of inquiry of a given discipline must be meta-methodologically identical to the processes of inquiry of the most fundamental discipline. Unfortunately, methodological reductionism seems to be almost ubiquitous amongst scientists, most of whom assume that if a 'higher' discipline (e.g. management science) cannot reproduce the same process of inquiry as the most fundamental discipline (often thought to be physics), then the higher discipline must not be given the esteem associated with being labelled a 'science'. This ideology is typically associated with the concept of 'the scientific method', which was first proposed by Bacon (1620) but has been the focus of intense debate and re-working ever since.

Having completed the constructive contribution in terms of generalising the concepts of 'reduction' and 'reductionism', as well as suggesting three distinct domains in which these can operate, it is now appropriate to consider the role that reductionist ideology has played in dictating the legitimacy (or illegitimacy) of scientific ontologies, theories and methods in more detail.

2.1.2 Reductio ad Unum: Reductionism and the Unification of the Sciences

"Science works precisely because different points of view illuminate different features of the world."

- Ian Stewart

Ontological reductionism is an ideology that seeks to construct a basis of legitimation for scientific ontologies. It requires that, in order for a field of study to be esteemed as 'scientific', its elements must be reducible to those of the most fundamental discipline. That is, they must be ontologically identical to the most fundamental discipline. This immediately raises the question: what is the most fundamental discipline? Contenders have included mathematics

(logical positivism)114, philosophy and/or psychology (idealism)115 and physics (physicalism)116. Of these contenders, physicalism (specifically that area of physics that deals with elementary particles) enjoys overwhelming support. However, this support is not derived from any specific successes involving the reduction of higher ontologies to the ontologies of elementary particle theory. Rather, it seems that its support is derived from the almost universal acceptance (at least amongst scientists) of the ontological position known as atomism.

Atomism is the belief that, at the most fundamental level, all objects in the world are composed of the same kind of indivisible substance. This substance is known etymologically as an a-tom (meaning in-divisible). The position can be traced back to at least the time of Democritus, who proposed an atomic theory of nature around 400 B.C. However, it has enjoyed its most significant influence during the Modern era, ever since chemists began explaining reactions in terms of the combining and re-combining of elementary particles (atoms) into clusters (or molecules)117. According to the ontological reductionist, the world is self-evidently atomistic (see Carnap (1932) and Carnap et al. (1938)). Furthermore, given that it is almost universally accepted that the atoms in question are material substances, contemporary atomism has become synonymous with materialism. Thus, the science that concerns itself with the material properties of these 'atoms' could theoretically account for all of the so-called 'higher disciplines' (which are concerned only with superficial macro-properties at some level).

If materialistic atomism is the ultimate truth about the way the world is, then particle physics is the ultimate science, capable of telling the complete story of the material world. Everything that happens in the world can theoretically be explained in terms of physical entities and the laws governing their behaviour and interaction. That is, ontological reductionism, together with a corresponding set of theory reductions, could unify the apparently disjointed sciences. This is exactly what Oppenheim and Putnam (1958) attempted in their paper Unity of Science as a Working Hypothesis. The paper contains:

114 The logical positivists claimed that all empirical 'posits' remain subjective until objectified by mathematical representation.
115 Absolute idealists (and to some degree, phenomenologists) argue that all physical entities are, in some sense, mental.
116 Physicalism can be thought of as the opposite of absolute idealism, in that it claims that only physical (or material) entities exist and that all proposed non-physical entities either do not exist or are in some sense composed of physical entities.
117 These basic elements (the elements of the chemist's periodic table) were later understood to be composed of even more basic elements (the physicist's protons, neutrons and electrons). Following a series of experiments, which included J.J. Thomson's 1897 experiment (electron), Ernest Rutherford's 1911 experiment (proton) and James Chadwick's 1932 experiment (neutron), protons, neutrons and electrons became the new atoms (etymologically speaking). However, more recently, even these elementary particles have been discarded as the real 'atoms' in favour of Gell-Mann's (1964) quark theory of matter. Current nuclear physics hypothesises the existence of some 200 elementary particles, classified into two main classes, leptons and hadrons. Whereas leptons are said to have no internal structure, hadrons are thought to have a complex internal structure. Hadrons are said to consist of baryons (which decay into protons) and mesons (which decay into photons). Indeed, current elementary theory does not stop at the existence of baryons and mesons, but also hypothesises the existence of quarks and anti-quarks as the basic building blocks of baryons and mesons. Therefore, according to current doctrine, the true 'atoms' are leptons, quarks and anti-quarks. As we shall see, the whole elaborate quark theory of matter is highly contentious and is not always consistent with contemporary quantum mechanical descriptions.

1. A well-developed concept of reduction, which is consistent with the reduction operation proposed here.

2. An atomistic ordering of the branches of science as describing distinct levels of reality. The entities on each level are said to be composed of simpler entities at the next level below. Specifically, they suggest the following hierarchy:

Social Groups
Multicellular Organisms
Living Cells
Molecules
Atoms
Elementary Particles

3. A program for the unification of science based on the foundation of particle physics, which is described as the most basic of the sciences. The program is completely reductionist in ideology, and proposes the reduction of psychology to biology, biology to chemistry and chemistry to physics. Anything that cannot be so reduced is said to be un-scientific118.

According to Oppenheim and Putnam (1958), the investigation of each level is the task of a particular scientific discipline, which would aim to discover the 'laws' governing the behaviour of the entities at its level. However, the end point of investigation would be the derivation of all possible objects of inquiry from various structural assemblages of objects at the next level below119. That is, the end point of the scientific enterprise is the elimination of the perceived difference between the objects of different disciplines.

Despite the positive reception of Oppenheim and Putnam's paper (especially amongst fellow positivists), a cursory inspection reveals that the proposed ontological hierarchy is, at the very least, an over-simplification. Even if one accepted the physicalist doctrine, science would look less like a uni-dimensional hierarchy and more like a complex web. For example, molecules are said to make up both living (biological) and non-living (geological) things, thus necessitating a fork in the supposed hierarchy. Furthermore, within physics itself it is not obvious how to organise gravitation, electromagnetism and nuclear physics as levels of a hierarchy without some yet-to-be-constructed grand unifying theory120. And what is to be made of disciplines

118 Oppenheim and Putnam's (1958) program for the unification of the sciences actually involves all three forms of reductionism discussed in the previous section.

119 It is not clear whether structural insights will always lead to behavioural insights.

120 Georgi proposed the first grand unifying theory (SU(5)), linking the electro-weak force with the strong (colour) force. Almost thirty years later, all of the grand unifying theories that have been proposed have been subsequently abandoned and even Georgi himself has shifted his area of research to the accessible energies of particle physics.

such as ecology (which combines elements of genetics, animal and plant physiology, behavioural science, geology and meteorology) or economics (which combines elements of psychology, sociology, philosophy and mathematics)? Clearly we are not dealing with a hierarchy, but an ever-expanding complex web.

All of the above highlights serious obstacles to the workability of the unification project. However, it does not necessarily challenge physicalism itself. Accordingly, the rest of this discussion will focus on the merits of the very ideology underpinning the unification project - the ontological reductionist commitment to physicalism.

In the history of scientific thought, physicalism has received variable attention. The literature on specific ontological reductions is immense. However, despite its size, it is limited to one or two well-known and much-debated attempts at an ontological reduction (for example, the debate about the reduction of the mental to the physical: the so-called mind-body problem). Unfortunately, the general issue of physicalism itself is rarely ever addressed. In fact, most unification projects assume physicalism as self-evident and quickly move on to the structural problems associated with unification.

One advocate who has given consideration to physicalism in its own right is Hartry Field. In his review of Tarski's theory of truth, Field (1972) claims that physicalism is an "intrinsic and extremely fruitful part of science", stating that science has made progress by trying to reduce social concepts/entities to biological ones, biological to chemical and chemical to physical, and dismissing as mythical anything that cannot be so reduced. One of the difficulties facing Field's physicalism, however, is that what counts as the most fundamental entity in the eyes of theoretical physicists keeps changing. Atoms have given way to electrons, protons and neutrons, which in turn have given way to more elementary particles such as quarks121. Furthermore, even the issue of what is a physical entity and what is not seems to be a matter of great dispute among theoretical physicists. In Aristotle's time, bare substratum was thought to be a physical entity, but is no longer. Since then Newton, Bohr, Maxwell and Einstein have all proposed new entities that have caused disputes about physicality122. Clearly, the belief in physicalism is not a result of science, but an a priori presupposition that informs the direction of the research taken by science.

Whilst the issues of what constitutes a fundamental element, or, for that matter, matter itself, cast serious doubt on the self-evidence of the physicalist mantra, perhaps the most serious source of scepticism towards physicalism, in recent times, has arisen from theoretical advancements within particle physics itself. Theoretical difficulties have arisen in identifying

121 Interestingly, Einstein never accepted quarks. However, quarks are not only accepted today, but are said to possess such properties as charm, spin, beauty and truth (Kirkham, 1995). As Kirkham (1995) rightly points out, if this sort of ontological exotica is acceptable to the physicalist, then one really begins to wonder what exactly the physicalist has against non-physical entities.

122 In many cases, these entities were originally dismissed as qualitates occultae and not physical in nature at all.

an elementary particle from one point in time to another. Even at the same time, two of the same elementary particles have been thought to be interchangeable in subtle statistical ways that challenge their individuality (Quine, 1969). Indeed, contemporary quantum mechanics has largely abandoned the concept of elementary particles entirely in favour of point events, or momentary local states123, thus making non-atomism a fundamental tenet of quantum theory and discarding the notion that there should be one fundamental description of nature (Bell, 1981; 1987)124. Indeed, quantum mechanics demands a multitude of context-dependent descriptions, and only under special conditions can a description in terms of context-dependent entities such as quarks, electrons, atoms or molecules be allowed (Primas, 1991).

If there are more things in heaven and earth than are dreamt of in particle physics, then the disunity of science is not merely an unfortunate consequence of history or a reflection of our limited computational capacities. Rather, it reflects the underlying ontological diversity of the world, or, as Dupre (1993) claims: the disorder of things. Accordingly, a position of ontological pluralism is perhaps the most appropriate attitude: there are many kinds of things. Chemical compounds, biological organisms, geological structures, ecological niches, meteorological climate zones, economic booms and busts, psychological traumas, etc. all exist in no less metaphysically robust a sense than do atoms, electrons or quarks125. Consequently, science could never come to constitute a single unified project associated with the study of only one kind of (fundamental) entity. Such ontological pluralism has found recent support from a wide variety of scientists and philosophers (Galison and Stump, 1996). It has even been argued that, where pluralism is accepted, science has made its greatest advances by avoiding fruitless scientific dead-ends, such as grand-unifying theories (Fine, 1986).

Acceptance of ontological pluralism requires that the typical understanding of scientific objects (as a hierarchy) be replaced with an alternative conceptualisation. Such an alternative could be based upon the metaphysics of contexts suggested in Part One. As we have seen, contextualism is the name given by Pepper to describe the view that the world is an unlimited complex of interconnectedness. Out of this total web we select certain contexts;

123 Interestingly, at various times this non-atomism seems to be forgotten. For example, in 1974 the particle group at Brookhaven announced the 'discovery' of the so-called J-particle almost simultaneously with the Stanford Linear Accelerator announcing the 'discovery' of the ψ-particle. These discoveries turned out to be observations of the same event, the so-called J/ψ-particle (mass 3098 MeV, spin 1, resonance 67 keV, strangeness 0). To explain this, theoreticians introduced a new entity, the so-called charmed quark. The J/ψ-particle was said to be composed of a charmed quark and an anti-charmed quark with their respective spins aligned (Fine, 1986).

124 Bohr argued that classical concepts split into mutually exclusive packages if they are used outside macroscopic physics. To get beyond this, a selection must be made as to which package of concepts will be used. Different selections enable illumination of different aspects of the phenomena under study. However, what is revealed will always depend on the point of view chosen, and the results of alternative explorations can never be combined into some unified picture (Dupre, 1993).

125 A consequence of ontological pluralism is that the disciplines are emancipated from a restrictive reductionist hierarchy. This presupposition has caused numerous debates. For example, in physics, the autonomy of condensed matter physics from the principles of particle physics has been debated from laboratories all the way to the halls of Congress. At stake was the future of the Superconducting Super Collider (SSC). If other branches of physics follow from the fundamental laws of particle physics, then there are compelling reasons for funding the more fundamental areas of research. Not surprisingly, condensed matter physicists argued for the non-reductive autonomy of their field (Galison and Stump, 1996).

these contexts serve as organising frameworks or patterns that give meaning and scope to a vast array of detail that, without the organising pattern, would be meaningless or invisible.

According to the contextualist, the context creates an integrating structure and fuses into unity items that, in other contexts, may appear as discrete entities. Within these contexts meanings emerge in complex strands or levels. Moreover, these meanings would disappear without the organising context. Thus, for the contextualist, the integrating structures (or contexts) are conceptual tools for knowledge generation and explanation and have no independent relationship with reality in their own right. Indeed, most contextualists would deny that the world has one intrinsic structure that can be grasped. Hence, all knowledge is fragmentary, limited and partial and only occurs within the limits of specific contexts. This position, as we have seen, is in line with the post-Kantian discourse of knowledge presented in Part One. A priori concepts govern every observation and description of reality and, as such, the idea that ontology can ever be objectively legitimated is infeasible. All we have to legitimate our ontologies are the presuppositions of the context and these, in turn, can only ever be legitimated in a similarly contextual manner. According to Batens (1992), these presuppositions include:

1. The problem.
2. The participants.
3. The contextual certainties126.
4. Methodological rules and heuristics.
5. Relevant truth statements.

Furthermore, Batens (1992) argues for the following conjectures:

1. There is no highest (or more fundamental) context.
2. A contextual certainty, or a relevant truth statement, or a methodological rule of a context C1 may be the problem of a context C2.
3. A may be certain, or relevantly true, in C1 whereas ¬A may be certain, or relevantly true, in C2.
4. No statement is contextually certain with respect to all problems.
5. Not all contextual certainties are logical certainties, and all logical certainties are contextual.
6. Meanings vary from one context to another, from one language to another, and often, from one person to another.
7. Communication does not require that people assign the same meaning to words.
8. Some problems are solved in an unconscious manner.
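Batens' presuppositions and conjectures admit a compact formal sketch. The tuple notation below is purely illustrative (it is not Batens' own notation), but it makes the context-relativity of certainty explicit:

```latex
% A context C represented by its five presuppositions
% (illustrative notation only; Batens does not use this tuple form):
%   P        -- the problem
%   A        -- the participants
%   \Sigma_C -- the contextual certainties
%   M_C      -- the methodological rules and heuristics
%   R_C      -- the relevant truth statements
C \;=\; \langle\, P,\ A,\ \Sigma_C,\ M_C,\ R_C \,\rangle

% The third conjecture then says that, for contexts C_1, C_2 and a
% statement \varphi, it is possible that
\varphi \in \Sigma_{C_1}
\quad\text{while}\quad
\neg\varphi \in \Sigma_{C_2},
% without contradiction, since certainty is always relative to a context.
```

On this reading, changing the context changes the admissible certainties and rules, which is precisely why no statement can be contextually certain with respect to all problems.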

A contextual understanding of the relationship between the objects of the various disciplines seems much more promising than a reductionist one, especially given the post-Kantian, postmodernist claim that knowledge is, always has been, and always will be context-dependent. Such an understanding rejects the hierarchies of the ontological reductionist and sees the objects of inquiry of the sciences (and the relationships between these objects) as in a state of constant change. As Lyotard (1979) argues:

"The classical dividing lines between the various fields of science are thus called into question - disciplines disappear, overlappings occur at the borders between the sciences, and from these new territories are born. The speculative hierarchy of learning gives way to an immanent and, as it were, 'flat' network of areas of inquiry, the respective frontiers of which are in constant flux" (Lyotard, 1979).

According to a contextualist understanding of ontology, various relations exist between the disciplines and there are no sharp boundaries. Accordingly, science is not seen as a uni-dimensional hierarchy but a complex web, the boundaries of which are indeterminate127. Moreover, the disciplines (new contexts) come into and out of being as new relationships are formed and old ones fall into disuse. If the contextualist ontological claim is accepted (i.e. that ontologies are relative to a working context and not determined by the structure of reality), then the reductionist attempt to find the 'true' ontological hierarchy of the world falls into disrepute. Furthermore, this casts serious doubt on corresponding programs to reduce 'higher-order' theories to 'lower-order' ones. However, this is exactly how the theoretical reductionist attempts to legitimate 'higher-order' theories. Accordingly, it is to theoretical reductionism that we now turn.

2.1.3 Dissectio Naturae: Reductionism and the Analytic Method

"All of the natural, social and mixed sciences are faced with micro/macro gaps because all of them study systems of some kind or other ... In many cases one knows how to solve problems concerning the micro-level or the macro-level in question, but one does not know how to relate them. In particular, one seldom knows how to account for macro-features in terms of micro-entities and their properties and changes thereof. Consequently the micro-specialists (e.g. microeconomics) and macro-specialists (e.g. macroeconomics) outnumber the experts in bridging the gap."

- Mario Bunge

126 These are statements which are relevant to the problem and considered contextually beyond discussion. No contextual certainty is common to all contexts.

Theoretical reductionism, as we have seen, is an ideology that seeks to construct a basis of legitimation for scientific theories. It requires that, for a theory to be deemed 'scientific', it must be analytically reducible to other (more fundamental) theories. As such, theoretical reductionism is associated with the use of Descartes' (1637; 1641) analytic method as the basis of legitimation for 'higher-order' theories.

The analytic method has been variously described (Wilson, 1978; Cottingham, 1988; Flage & Bonnen, 1999; Hatfield, 2002). However, at its heart is the idea of the elimination of a supposedly specific (macro) property by a redefinition that reconstructs that (macro) property as a function of other (micro) properties. The quintessential example is the 'kinetic' redefinition of temperature as a function of pressure, volume and the total number of atoms. Accordingly, the analytic method can be defined as follows:

Consider two theories, T1 and T2. T1 is analytically reducible to T2 if and only if there exists a set of reductive definitions, R, which define all the concepts of T1 in terms of some set of concepts found in T2, and, all relationships among the concepts of T1 can be derived from the relationships among the concepts of T2128.
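The definition can be stated symbolically, together with the kinetic-theory illustration mentioned above. The notation is an illustrative gloss on the definition, not a formula drawn from the reduction literature; the physical symbols are the standard textbook ones:

```latex
% T_1 is analytically reducible to T_2 (via reductive definitions R) iff
% every concept of T_1 is defined by concepts of T_2, and every law of
% T_1 is derivable from the laws of T_2 under those definitions:
T_1 \preceq_R T_2 \;\iff\;
  \forall c \in \mathrm{Conc}(T_1)\colon\ R(c) \subseteq \mathrm{Conc}(T_2)
  \;\wedge\;
  \forall L \in \mathrm{Laws}(T_1)\colon\ \mathrm{Laws}(T_2) \vdash_R L

% The 'kinetic' redefinition of temperature: for an ideal gas of N
% particles, the equation of state pV = NkT re-expresses temperature
% as a function of pressure p, volume V and particle number N,
T \;=\; \frac{pV}{Nk}\,,
% or, equivalently, via the mean translational kinetic energy per particle,
\langle E_{\mathrm{kin}} \rangle \;=\; \tfrac{3}{2}\,kT .
```

Note that most of the disputed reductions discussed in this Section turn on the second conjunct (the derivability of the macro-laws) rather than on the mere redefinability of the macro-concepts.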

From the outset, however, the analytic method has been associated with analytic (or theoretical) reductionism. Even the founder and principal proponent of the method, René Descartes (1641), could not resist using the technique as a basis of legitimation, stating:

"A term is scientifically analysable if and only if it is reducible to primitive terms by a chain of definitions", thereby linking 'scientificity' with analytic reduction. In this Section it is argued that Cartesian reductionism (the belief that the analytic method is the only way of generating legitimate 'scientific' knowledge of macro-entities, their macro-properties and the macro-laws governing them) is an unjustifiable restriction to place on scientific knowledge and that, if strictly adhered to, much of what is currently considered scientific would be deemed unscientific. It is further argued that most proposed analytic reductions do not even meet the conditions inherent in the above definition and require a set of subsidiary assumptions and hypotheses, H, in order for the reduction to hold.

The unprecedented success of the analytic method in nearly every domain of knowledge justifies, to a large extent, the high esteem that the approach enjoys. Nevertheless, we must look beneath the surface appeal to uncover some of the limits of this approach. What is undeniable is that the method has procured for us an enormous amount of information about the various 'parts' or 'internal layers' of phenomena. However, the question of whether this

127 The argument for an indeterminate demarcation between science and non-science is continued in Section 2.2.

information has been able to help us better understand phenomena at the macro-level is by no means easily answered, as there are numerous examples whereby attempted analytic reductions have become embroiled in controversy.

In his paper Reductionism: Palaver without Precedent, physicist Hans Primas (1991) discusses six alleged theory reductions in the physical sciences, concluding that none of the six meets the requirements for a true analytic reduction. They are as follows:

1. Chemistry has not been reduced to physics: It is impossible to derive the non-linear differential equations of chemical kinetics from linear quantum mechanics. Similarly, there does not exist a universal relation between chemical substances and molecules.

2. Chemical purity is not a molecular concept: The attempt to reduce chemical purity to a molecular concept involves the redefinition of chemical purity as a substance that is composed exclusively of atoms or molecules of a single kind. However, this redefinition excludes several chemically pure substances, such as liquid water, which is certainly not solely composed of H2O. A comprehensive discussion of this reduction can be found in Van der Waals (1927).

3. The theory of heat has not been reduced to statistical mechanics: There are several problems associated with this attempted reduction, including the impossibility of deriving the zeroth law of thermodynamics from statistical mechanics and the absence of Einstein-Podolsky-Rosen correlations in thermodynamics. For a comprehensive overview of the logical difficulties associated with this reduction, the reader is referred to Kline (1995), who has made a standing bet (at ten to one in favour of the physicist) with all physicists who believe they are able to reduce thermodynamics to statistical mechanics.

4. Temperature is not a molecular concept: The definition E = (3/2)kT does not reduce temperature to a molecular concept. Primas (1991) claims that this example (cited again and again) bespeaks an incredible ignorance of even the most elementary concepts of physics.

5. Classical physics is not a limiting case of Einsteinian physics: Since no unique Galilean limit exists for Lorentz-covariant electromagnetic interactions, pre-Einsteinian physics cannot be reduced to Einsteinian physics.

6. Classical physics is not a limiting case of quantum physics: There is a widely held belief that classical mechanics is the limiting case of quantum mechanics for vanishing Planck's constant h. However, it has been proven that there is no universal classical limit of quantum mechanics (i.e. the limit h → 0 does not exist in the norm topology). What is possible is that, for small families of

128 Where T1 and T2 are two theories (including all relevant objects, properties, concepts and laws).

quantum states, certain quantum systems may behave exactly as classical systems. However, whether a quantum system behaves classically or not is not an intrinsic but a contextual property of the system.

Primas (1991) concludes his study by stating:

"There exists not a single scientifically well-founded and nontrivial interdisciplinary example for a theory reduction in the sense of Hempel, Oppenheim, Nagel, Sneed or Stegmüller. The traditional concept of theory reduction rests on much too simple a view about the structure of scientific theories."

Further to these examples within the physical sciences, there are numerous examples in the natural, social and mathematical sciences including:

1. Attempts to reduce biology to chemistry: Despite all the progress of molecular biology and biochemistry, one is not in a position to remedy the absence of biological concepts in the vocabulary of chemistry, nor to give truly exhaustive definitions of the fundamental biological concepts in chemical terms. What has actually been done is the assumption of biological phenomena within a field of chemistry. This leads us to discover a whole series of chemical aspects of biological phenomena, but it does not imply epistemological reduction (Agazzi, 1991). Molecular biologists continue to dream of a unified discipline supplying explanations for all phylogenetic and ontogenetic development from primitive relations of genetic material. However, according to most macro-biologists such a reduction will never capture the systemic aspects of complex organisms, let alone the ecological systems in which they live and reproduce (Galison and Stump, 1996).

2. Attempts to reduce innovation theory to economics: There are a number of reasons why economic theory cannot produce the "how to" information needed to create innovations. Most of these reasons can be summarised by the fact that innovations occur within a hyperspace of much larger dimensionality than economic theory (hence economic theory must make use of a subsidiary set of assumptions and hypotheses for the reduction to hold) (Kline, 1995). There exists a large literature concerning the shortcomings of neoclassical economics as a predictive theory. For further information, the reader is directed to Klein (1977), Nelson and Winter (1982), Porter (1990), Rothschild (1990), Kuttner (1991), Dietrich (1991), Kline & Kash (1992), Brooks (1993) and Kline (1995).

3. Attempts to reduce mathematics to logic: Bertrand Russell (1903) heralded the (apparent) triumph of this program in Principles of Mathematics stating:

"The fact that all mathematics is symbolic logic is one of the greatest discoveries of our age; and when this fact has been established, the remainder of the principles of mathematics consists in the analysis of symbolic logic itself."

The main hypotheses of the program to reduce mathematics to logic were that: all specific mathematical terms are definable on the basis of the logical vocabulary; all mathematical arguments could be formulated in the framework of logic; and that all specific mathematical theorems would follow from axioms of logic and logical inference rules (Russell, 1901; 1903; Russell & Whitehead, 1910; 1912; 1913). The essential limitations of this program were uncovered by the work of Gödel (1931), Tarski (1944; 1956) and others, in particular, Gödel's incompleteness theorems, which refuted all three hypotheses129.

Perhaps the crucial problem that theoretical reductionism faces is that, if it is accepted, then almost none of the knowledge associated with the so-called 'higher level' disciplines of chemistry, biology, psychology, sociology, etc. would be deemed scientific. It seems that, despite the huge amount of (micro) information that the analytic method has produced, it has been remarkably unsuccessful in providing knowledge of 'context-dependent' (macro) entities and their properties. Indeed, it is from within the erstwhile bastion of reductionism, physics, that perhaps the most scathing attack on the analytic method has come in recent times. In the words of quantum theorist Hans Primas (1991):

"Most inter-theoretical relations are mathematically describable as singular asymptotic expansions. Asymptotic expansions are never universally convergent, but valid only in a very specific context. In such context-dependent descriptions the specification of the context is at least of equal importance as the first principles. In other words, higher-level descriptions are compatible with the first principles, but cannot be derived from them alone.

The context-dependence of any description of nature is intrinsically related to the fact that we can say nothing about nature without making abstractions. We have to abstract from irrelevant features, but it is not laid down by natural laws what has to be considered as relevant and what as irrelevant. Without abstractions there is no science. The abstractions we introduce into experimental sciences have also to be introduced into a theoretical description by corresponding new contextual topologies, which as a rule induce symmetry breakings. Different branches of science are using different approaches, which can be characterised by different topologies and symmetry breakings.

129 In his ground-breaking, epoch-making theorem, Kurt Gödel (1931) proved that a complete deductive system was impossible for even so modest a fragment of mathematics as elementary number theory. As Rudy Rucker (1982) has expressed it, Gödel's (1931) Theorem leaves scientists in a position similar to that of Joseph K. in Kafka's (1916) novel The Trial: "We scurry around, running up and down endless corridors, buttonholing people, going in and out of offices, and, in general, conducting investigations. But we will never achieve ultimate success; there is no final verdict in the court of science leading to absolute truth". However, Rucker (1982) notes: "To understand the labyrinthine nature of the castle [i.e. court] is, somehow, to be free of it ... and there's no understanding of the court of science that digs deeper into its foundations than the understanding given by Gödel's Theorem".

As a rule, symmetry breakings induce the emergence of hierarchical levels, but it would not be correct to say that nature is structured hierarchically. It is our viewpoint, with the associated abstractions, which generates the higher levels of a hierarchical description. The task of a higher-level description is not to approximate the fundamental theory but to represent new viewpoints; first principles are never sufficient for deducing higher-level theories. Nevertheless, given a well-specified context, higher-level descriptions can be derived by asymptotic expansions. These expansions are not universally convergent but one can achieve convergence with respect to a new topology ... this new topology introduces automatically new context-dependent observables, which are not already present in the fundamental description. In this mathematically precise sense, one can speak of the emergence of novelty in descriptions on a hierarchically higher level" (Primas, 1991).

Thus, the irreducible pluralism of languages and concepts of description in science is as desirable a feature as is the irreducible plurality of political views in a democracy (Suppes, 1978).

2.1.4 Reductio Methodologicae: Reductionism and the Scientific Method

"What we observe is not nature itself, but nature exposed to our method of questioning."

- Werner Heisenberg

As we have seen, ontological reductionism has been dismissed on the grounds that the atomism it implies is too heavy a metaphysical presupposition. Similarly, the analytic method has been unable to adequately describe all macro-phenomena in terms of micro-phenomena. As such, Cartesian (theoretical) reductionism has also been dismissed. However, there is a form of reductionism not yet discussed, which seems, at first glance, to be a neutral, perhaps even benign, use of an epistemic reduction to provide the basis of legitimation.

Suppose we accept ontological pluralism and accept that each discipline busies itself with the specific context-dependent objects of its own ontological domain. However, we wish to partition those disciplines that study their objects in a scientific manner from those which do not. The question that immediately arises is: what does it mean to attribute to a discipline the qualification of being a science?

One answer to the above question that has enjoyed a certain intuitive appeal is the reductionist response that the discipline must be modelled after the fundamental science of physics. This condition has been explicated in two distinct ways. The first employs Cartesian

reductionism, by stating that the "modelling after" refers to the moment when the concepts used for description and explanation in the discipline of concern can be redefined in terms of the concepts used in an established science. Given that the problems associated with Cartesian reductionism have already been discussed, attention is restricted here to the second way in which the "modelling after" condition has been met.

The second explication, which circumvents the problem of Cartesian reductionism, may be described as methodological reductionism. Methodological reductionism, as we have seen, is an ideology that seeks to construct a basis of legitimation for the methods of inquiry of the individual sciences. According to methodological reductionism, each of the individual scientific disciplines must adopt a method that is meta-methodologically identical to that of the most fundamental discipline. That is, each discipline must be reducible in methodology to physics. However, notwithstanding its initial appeal, this doctrine has a number of problems associated with it.

The first, and perhaps the most important, problem arises from an attempt to characterise the method of science (see Chapter 2.2). If science is deemed to be empirical, then what are the canonical methods of experimentation and theory verification? If it is positivistic, then by what method do we have recourse to mathematisation? If it is characterised by its falsifiability, then how are its conjectures to be falsified? Or if it is to be characterised by any of the sociological models, then how are its methods to be unified across research groups separated by domains of study (disciplines), sub-disciplines, metaphysical persuasion, or, perhaps most importantly, time?130

The second objection to methodological reductionism can be raised on the grounds that it implies a reductionist ontology (atomism). That is, it disregards the fact that every discipline is concerned with its specific context-dependent ontological elements. By way of illustration, suppose that we settle on a single scientific methodology. For example, suppose we settle on science being characterised by the methodology of observation, followed by abductive hypothesis generation, mathematisation and the deduction of general laws, and finally the inductive verification of these laws by repeated controlled experimentation131. It may occur that the elements of a particular ontological domain are such that they lend themselves to controlled experimentation, or to mathematisation; however, this surely depends on the nature of the context-dependent phenomena under study. Only an a priori commitment to the

130 An interesting study of different methodologies has been conducted by Knorr-Cetina (1981). Drawing an ethnographic comparison of high-energy particle physics and molecular biology, Knorr-Cetina concludes that there does not exist a single meta-methodology that could unify the experimental practices of both disciplines. According to Knorr-Cetina, high-energy particle physics experimentation is characterised by a preoccupation with instruments, methods, checks and cross-checks in an attempt by experimenters to disengage themselves from the phenomena under study. However, molecular biological experimentation, on the other hand, is characterised by direct contact with the phenomena. Knorr-Cetina invokes the metaphors of a control tower (high-energy particle physics) and a kitchen (molecular biology) to describe the two practices and argues that, by calling both practices 'experimentation', scientists (and philosophers of science) obscure the fact that they represent two fundamentally different methodological cultures.

essential sameness of all things would imply that a single methodology could be found that applies to all areas of inquiry. Thus, it seems that methodological reductionism is parasitic on a kind of ontological reductionism. However, as we have seen, contextualism implies that methodological rules and heuristics emerge from the domain of investigation (or the context). Thus, methodological pluralism seems as reasonable as the ontological and theoretical pluralisms that have been argued for in the previous sections. Moreover, as will be discussed (Chapter 2.2), all attempts at characterising this supposed unitary method have failed. Accordingly, most contemporary philosophers of science have despaired of ever finding a single meta-methodology applicable to all areas of inquiry.

2.1.5 Reductio ad Absurdum: The Illegitimacy of Reductionism and Monism

"Understanding difference while respecting its diversity helps us to better appreciate the marvellous variety and richness of what exists."

- Evandro Agazzi

Evandro Agazzi (1991) states:

"The most general characteristic identifiable in reductionism is that which may be called 'the elimination of the difference', elimination consisting not so much in a search for that which might be common to two kinds of things, but rather in claiming that the perceptible difference is but apparent, affirming that, in reality, one of the two kinds embraces also the other".

Reductionism is associated with claims that there is an underlying unity to apparent multiplicity. That is, reductionism is a form of monism. The first Greek philosophers began by proposing their worldviews from a monistic perspective, and monism has never failed to manifest itself throughout history since (Agazzi, 1991). In the previous Sections we reviewed some of the more prominent examples of reductionism in science. These included:

1. Ontological Reductionism - the use of reductionist ideology to construct a basis of legitimation for the ontologies of scientific disciplines.

2. Theoretical Reductionism - the use of reductionist ideology to construct a basis of legitimation for the theories of scientific disciplines.

131 This is a highly simplified account of science, as will become apparent in Section 2.2.

3. Methodological Reductionism - the use of reductionist ideology to construct a basis of legitimation for the methods of inquiry of scientific disciplines.

In each of these, the basis of legitimation is constructed by reducing what is apparently complex, emergent and plural to what is simple, fundamental and (ultimately) singular.

On the question of the relations between the various possible objects of study, the reductionist response is the one that monism provides. Ontological reductionism claims that there is but one type of (fundamental) substance in the world, the study of which would account for all observable phenomena, including chemical, biological, ecological, psychological, economic and societal phenomena.

On the question of the relations between the various theories, the reductionist response is again monistic, this time cultivating the absurd belief that there is but one conceptual framework (or context) into which all theories must be translatable in order to be legitimate. Such a position is reminiscent of the commensurability thesis critiqued in Part One. According to theoretical reductionism, 'higher-level' theories such as the Black-Scholes theory of pricing financial derivatives, for example, should be redefinable in terms of the quantum theory of the atom.

On the question of the relations between the various methods of inquiry, the reductionist response is once again the monistic one. According to the methodological reductionist, there is but one 'scientific method' and all other areas of human inquiry should adopt this method or be deemed illegitimate.

The conclusion that follows is that the use of the reduction operation as a criterion of legitimacy (usually expressed as 'scientificity') is both philosophically naïve and an unnecessary constraint on our understanding of the "marvellous variety and richness of what exists". There does not exist one fundamental substance, the study of which would account for all phenomena, but many context-dependent entities. Furthermore, the theories developed to describe the behaviour of these entities (and the methods used to develop these theories) are also context-dependent, and cannot be derived from the theories (or methods) of another context. Thus, science is necessarily pluralistic. Fittingly, the arguments for a pluralistic approach will be strengthened in further chapters, where the notions of 'scientific method' and 'scientific truth' are explored.

Much of the preceding discussion has touched on the difficult task of providing a precise characterisation of science (if such a meta-discipline exists). It has been argued that reductionism is an inappropriate ideology on which to base any characterisation of 'scientificity'. This argument is highly contentious, as it is generally assumed that there is such a thing as 'the

scientific method' and that the use of this method not only secures the scientificity of various disciplines, but their legitimacy as well. Accordingly, the next Chapter looks into this issue in more detail.

2.2 Scientism and the Will to Methodology

"Science is a tool of the Western mind and with it more doors can be opened than with bare hands. It is part and parcel of our knowledge and obscures our insight only when it holds that the understanding given by it is the only kind there is."

- Carl Jung

Science is highly esteemed. Moreover, there is a widely held belief that there is something special about its methods; something that unifies the various 'scientific' disciplines under a general banner and subsequently legitimates the outcome of their inquiries. But what exactly is this 'scientific method'? As we shall see, the answer to this question is by no means straightforward.

2.2.1 Francis Bacon and The Verificationist Account of Science

"Science is derived from the facts."

- Francis Bacon

A popular conception of science is captured by the slogan 'science is derived from the facts' (Chalmers, 2000). These facts are assumed to be claims about the world that can be directly established by a careful, unprejudiced use of the senses. Thus, science is based upon what we can see, hear and touch rather than personal opinion or speculative conjecture. Under this conception, the laws and theories that make up scientific knowledge are verified by observation. Once these laws have been verified, they can be drawn on to make predictions and offer explanations. This account (known as the verificationist account of science) has a certain populist appeal and can be depicted as shown in Figure 3 below.

The crucial step in this process, the step by which the overall process gets its name, is step 4a: verification. Verification relies on the logic of induction, which states that:

"If a large number of As have been observed under a wide variety of conditions, and if all those observed As possess the property B, then by induction all As have the property B."

The use of inductive logic to 'verify' the results of science is often credited to Francis Bacon (1561-1626) and gained widespread popularity during the Enlightenment, with Isaac Newton expounding it in his Opticks:

"Analysis consists of making experiments and observations, and in drawing general conclusions from them by induction, and admitting no objections against the conclusions but such as are taken from experiments or other certain truths" (Newton, 1704).

[Figure 3 is a flow diagram; its stages and transitions are reproduced here in linear form.]

Observation (the beginning of scientific inquiry)
→ 1. Abduction (the generation of hypotheses) →
New Conjecture or Theory (speculative)
→ 2. Deduction (the implications of hypotheses) →
Verifiable Consequence (deducible from the conjecture)
→ 3. Experimentation (the testing of hypotheses) →
Observation (of experimental results)
→ 4a. Induction (the verification of hypotheses and the establishment of scientific truths), or 4b. Refutation (the falsification of hypotheses) →
Verified Theory (the result of successful science)
→ 5. Deduction (the use of established scientific truths) →
Prediction and Explanation (the end of scientific inquiry)

Figure 3: The Verificationist Account of Science

According to the verificationist, scientists formulate theories in order to explain perceived patterns in the world and verify these theories by accumulating supporting evidence. As such, it is an empiricist approach to science. The crucial step in the process, as we have seen, is the one involving the use of inductive logic. According to the inductivist, the principle of repeatability is what puts any knowledge that may be called 'scientific' in a different domain from opinion or taste. In this sense scientific knowledge is often described as 'public knowledge' - we have no choice but to accept that which can be repeatedly demonstrated in experiments.

Notwithstanding its intuitive appeal, there are certain aspects of the verificationist account that may attract critical attention. These have to do with the (inductive) logic of verification. After the results of an experiment become known, it is important to realise what has been proven and what has not been proven. What has been conclusively demonstrated is that at time T, under conditions C, according to observers O, experiment E yielded results H. Every opinion about the experiment, or theory based on the outcome of the experiment, however, may be disputed. This is because induction is not a logically conclusive form of proof.

The study of abductive, deductive and inductive forms of reasoning constitutes the discipline of logic, a detailed account of which is outside the scope of this work. However, what must be stated is that of the three forms of reasoning involved in the verificationist account of science, only deduction can lead to logical proof. An example of a simple logical deduction is:

1. All papers on systems engineering are boring
2. This is a paper on systems engineering

3. Therefore, this paper is boring

In this argument, (1) and (2) are the premises and (3) is the conclusion. It follows that if the premises are true, then the conclusion must also be true132. This is the key feature of a logically valid deduction. Alternatively, a simple inductive argument is:

1. Many papers on systems engineering are boring
2. This is a paper on systems engineering
3. Therefore, this paper is boring

In this example, the conclusion does not necessarily follow from the premises, even if the premises are true. It is impossible to logically prove (3) from the 'facts' of (1) and (2).

A simple example of an attempt to base a scientific law on inductive forms of reasoning is provided below:

1. Metal x1 expanded when heated at time t1.
2. Metal x2 expanded when heated at time t2.
...
n. Metal xn expanded when heated at time tn.

n+1. Therefore, all metals expand when heated.

132 What deduction is silent on is the truth or falsity of the premises themselves. Thus, if someone were to dispute either of the premises in the example above, then although they would have to accept the deduction, they could still, nevertheless, dispute the truth of the conclusion.

However, repeated observations of metals expanding when heated can never 'prove' the conclusion that all metals expand when heated. No matter how large n becomes, there can be no guarantee that a sample of metal at some stage in the future will not contract.

The problem with proof by induction was understood as early as 1741, when David Hume published his Treatise of Human Nature (Hume, 1741). Briefly summarised: multiplying confirmatory observations will never get us any closer to proof - there is always the possibility that further observation may reveal exceptions that disprove the rule133. According to Hume, we discover the 'rules' of induction by observing what we habitually accept. Hume's attack on the inductive method initially fell on deaf ears. Bell (1994) argues that the reason for its "chilly reception" was that it threw doubt on two of the most cherished beliefs of the modern world: first, the belief in the certain truth of Newtonian physics, and second, the belief in the ability of the experimental 'scientific' method to yield the true nature of the world. In his treatise, Hume argued that induction may yield probable truth, and it was this suggestion that eventually led to the modification of the verificationist account in the 19th century.

The first serious attempt to modify the inductive dogma of Francis Bacon and his contemporaries involved weakening the demand that scientific knowledge be proven to be absolutely true. Under this modified verificationist account of science, scientific knowledge is deemed 'probably true' in light of the evidence. Thus, the vast number of observations of metals expanding when heated warrants the assertion that the claim that 'all metals expand when heated' is probably true. The logic of induction can, therefore, be reformulated as:

"If a large number of As have been observed under a wide variety of conditions, and if all these observed As have the property B, then by induction all As probably have the property B."

A far-reaching (and probably unintended) consequence of the new probabilistic version of induction was the spur it gave to the study of probability theory (Salmon, 1970). Perhaps the most important development for our purposes came from the work of Thomas Bayes. In his famous Essay Towards Solving a Problem in the Doctrine of Chances, Bayes (1763) set out to calculate the probability of an 'outcome' given prior knowledge. His subsequent theorem (Bayes' Theorem) involves assigning prior probabilities of a given phenomenon occurring before trials are made. From these probabilities, and observational evidence, it is possible to determine the posterior probability that a given event will occur. When used in an iterative

133 Bertrand Russell made the point metaphorically in his story of the inductivist turkey (Russell, 1912). This turkey found that, on his first morning at the turkey farm, he was fed at 9am. However, being a good inductivist, he did not jump to conclusions. He waited until he had collected a large number of observations, under a variety of conditions, of the fact that he was fed at 9am. Finally, he was satisfied and inferred that "I am always fed at 9am". Alas, his conclusion was shown to be false on Christmas morning, when his throat was cut.

fashion, Bayes' formulae 'wash out' any arbitrariness in the assignment of prior probabilities. This recursive use of Bayes' theorem is known as the Bayesian method. Because the Bayesian method allows posterior probabilities to 'over-ride' prior probabilities, it provides a link between estimated and actual probabilities of an outcome. It is this link that gave new hope for the logic of induction and the verificationist account of science.
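The 'washing out' of priors under recursive Bayesian updating can be illustrated with a small numerical sketch. The example below is not from the original discussion: the coin-tossing hypotheses, prior values and observers are invented purely for illustration of how strongly divergent priors converge under shared evidence.

```python
# Illustrative sketch: recursive application of Bayes' theorem over two
# rival hypotheses about a coin's bias. The point is that two observers
# with very different priors converge as evidence accumulates.

def update(priors, likelihoods):
    """One application of Bayes' theorem: posterior ∝ likelihood × prior."""
    unnormalised = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(unnormalised)
    return [u / total for u in unnormalised]

# Hypotheses: H1 = "coin lands heads with probability 0.5",
#             H2 = "coin lands heads with probability 0.8".
likelihood_heads = [0.5, 0.8]  # P(heads | H1), P(heads | H2)

# Two observers begin with sharply opposed prior convictions [P(H1), P(H2)].
observer_a = [0.9, 0.1]
observer_b = [0.1, 0.9]

# Both observe the same run of 20 heads; the data dominate the priors.
for _ in range(20):
    observer_a = update(observer_a, likelihood_heads)
    observer_b = update(observer_b, likelihood_heads)

# Both posteriors for H2 now lie close to 1, despite the opposed priors.
print(observer_a[1], observer_b[1])
```

This is precisely the convergence that gave the verificationists hope; the difficulty raised in the next paragraph is that, for interesting scientific theories, there is no empirical way of obtaining the likelihoods that drive the updating in the first place.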

Unfortunately for the verificationist, however, this hope was never to be realised (Chalmers, 2000). To calculate the probability of interesting scientific theories requires defining the constituent elements of the outcome. For example, to calculate the probability that Newton's theory of gravitation is true requires assigning the prior probabilities of the gravitational attraction of the Moon to Earth, Sun to Mars etc. It is mathematically demonstrable that these prior probabilities will be 'washed out' with repeated use of Bayes' formula; however, problems arise due to the fact that there can be no reliable way of determining the posterior probabilities necessary to wash them out. One simply cannot devise an empirical test to determine the posterior probability of gravitational attraction between any two objects134.

In summary, Bayes' formulae cannot fulfill the hope that theories can be inductively secured135. This seems to cast doubt on the empiricist's belief that science is derived exclusively from observable facts. According to a group of philosophers and scientists congregating in and around Vienna in the early 20th century, this was of such concern that they pursued a major research program aimed at salvaging the empirical basis of science. The size, scope and vigour with which this research program was pursued warrants a separate discussion.

2.2.2 The Vienna Circle and The Positivist Account of Science

"It is mathematical representation alone that makes possible the notion of objective knowledge."

- Moritz Schlick

With the fall of induction as a logically valid method on which to base scientific claims to truth, a group of philosophers who became known as the 'Vienna Circle' took up the challenge of finding a replacement. After a significant amount of effort, the school of thought that emerged became known as positivism.

134 Another problem with the probabilistic form of induction is that attempts to justify it ultimately appeal to exactly the same form of inductive argument as the original (Chalmers, 2000).
135 They have, nevertheless, been profoundly important in theoretical statistics and usefully applied in modelling many unanticipated phenomena.

The word positivism comes from the French for 'to posit'. It was first employed by Auguste Comte (1893), who, following Hume and the empiricists, argued that science should be based on what is 'posited' to our immediate senses. As such, speculative assertions not reducible to observation must be excluded from science. For this reason, positivism came to mean 'anti-metaphysics'136. When formal systems, such as mathematics, are applied to what is 'posited', positivism becomes logical positivism.

The rise of logical positivism in the early 20th century was sparked by the need to understand the impact of contemporary developments in theoretical physics on the philosophy and methodology of science (Ayer, 1959). At the time, physics was in a state of upheaval, with Einstein's (1915) General Relativity, Bohr's (1928; 1935) Quantum Mechanics and Heisenberg's (1927) Uncertainty Principle each contributing to what was to become the overthrow of the Newtonian cosmology. In the face of this upheaval, the Vienna Circle aimed to ensure that the empirical foundations of science were not discarded. Indeed, it is within the positivist school that the empiricist tradition reaches its most sophisticated form.

The Vienna Circle was initially known as the Ernst Mach Society, after the scientist and philosopher of the same name. Mach believed that 'scientific' knowledge must be considered to be 'true' and in a different domain from that of opinion or taste. Thus, he considered scientific modes of inquiry as being able to produce 'objective' and 'verifiable' 'knowledge', whereas all other modes of inquiry only ever produce 'subjective' and 'unverifiable' 'opinion'. Mach was an arch empiricist and argued that the entire content of science should consist of the relationships among the data of our sense-experience (Ray, 2000). Scientific concepts and generalisations, according to Mach, do not exist (ontologically) in their own right. They are merely names of particular empirical objects (a position known as 'nominalism') and are meaningful only inasmuch as they are grounded in observation. Any concept that cannot be reduced to observation, according to Mach, was meaningless. As we have seen, it was this position that Quine (1953a) referred to as 'reductionism' in his paper Two Dogmas of Empiricism. The members of the Vienna Circle, including A. J. Ayer, Rudolph Carnap, Herbert Feigl, Hans Hahn, Carl Hempel, Otto Neurath, Moritz Schlick and Friedrich Waismann, were initially bound by this common 'reductionist' belief. Moreover, they believed that all 'scientific' modes of inquiry would one day be unified into a single meta-methodology (a position that has been termed in the present work 'methodological reductionism').

During the 1920s, the logical positivism of the Vienna Circle formed an alliance with the logical empiricism of the 'Berlin School'. The similarities of both schools were far-reaching: both insisted upon empiricism; both emphasised the importance of logic; both looked to the

136 As we have seen (Section 1.1), the term 'metaphysics' does not have a precise or agreed-upon meaning (for that matter, neither does science, as we shall see later). In positivist philosophy of science, metaphysics is used as a pejorative term, generally applied to whatever is regarded as non-empirical. Traditionally, however, metaphysics has been regarded as the study of what lies behind the world of appearance. Ironically, a great many people would regard the sciences of physics, chemistry and astronomy as fitting this description.

physical sciences as the paradigm of objectivity and both completely rejected metaphysics. The alliance between the two schools was further formalised with the founding of the journal Erkenntnis in 1930, with Rudolph Carnap (Vienna) and Hans Reichenbach (Berlin) the co-editors. In attempting to maintain the privileged position of the exact sciences, the positivists first made some significant concessions. These concessions were part of an ambitious attempt to completely reformulate the verificationist model and hence circumvent some of the well-known problems associated with induction. Thus, Reichenbach (1920), in the introduction to his book on Einstein's relativity, writes:

"Every factual statement, even the simplest one, contains more than an immediate perceptual experience; it is already an interpretation and therefore itself a theory ... the most elementary factual statement, therefore contains some measure of theory".

This clear statement of what is now called 'the theory-ladenness of observation' represented a move away from the received (empiricist) dogma of the 19th century that facts always precede theory. Carnap (1928) also endorsed this view in his classic of positivist thought, The Logical Structure of the World. In response to these concessions, the positivists modified much of the terminology of empiricism. For example, 'facts', which imply some direct, theory-neutral connection to the underlying 'real world', became known as 'data' (which was considered to imply only observational input). In this sense, the logical positivists began to embrace a kind of phenomenalism, which views propositions asserting the existence of physical objects as analytically equivalent to propositions asserting that subjects would have certain sensations were they to have certain others. As such, the positivists turned their attention to the analysis of these sensations, developing a sense datum theory (Fumerton, 1992)137.

The positivist adoption of a phenomenalist epistemology raised a whole set of new difficulties associated with the scientific claim of objectivity. According to phenomenalism (and its more recent variant, phenomenology), the only contingent propositions (recall Kant's distinction between contingent and necessary propositions) that can be known directly are those describing the contents of our own minds, and if any belief about the 'real world' is to be justified, it must be inferentially justified from what we know about our minds. In order to solve the problem of objectivity, Schlick (1918) claimed that by ordering, interpreting, and structuring the data of our sensory perceptions within a rigorous mathematical framework it was possible to 'objectify' perceptions and transform them from 'appearance' into 'experience'. In other words, it is mathematical representation alone that makes possible the notion of objective knowledge. Thus, Schlick (1918) draws the distinction between knowledge of

137 Where sense data were viewed as mind-dependent entities.

cognition (erkennen) and acquaintance with the immediately given sensory perception (erleben). The latter, since it is momentary, was thought to be incapable of yielding knowledge. According to Schlick, knowledge is possible only when we embed such momentary perceptions within a rigorous mathematical system. The idea being that, in the face of the theory-ladenness of observation and hence the subjectivisation of perception (erleben), there only remains one objective component of knowledge, and that is logic itself (erkennen). According to the logical positivists, the discipline of mathematical physics was the obvious shining example of perceptions being embedded within a rigorous mathematical framework; it was therefore paradigmatic of objectivity and rationality and deserved pre-eminence amongst the sciences.

The difficulties associated with using formal systems (such as mathematics) to 'objectify' sense experience began to arise with the fall of the Kantian conception of the transcendental unity of apperception. According to Kant, all rational beings effectively bring to every observation and interpretation of the world the same set of a priori concepts. When applied to Kant's spatiotemporal construction, the doctrine of the transcendental unity of apperception effectively states that the space and time underlying the constructive procedures of pure mathematics are the very same space and time within which we perceive and experience nature through the senses (i.e. they are the a priori intuitions that form the sensible preconditions of objects of experience). It is this conception of the intuitive nature of pure mathematics that enabled Kant (1781) and others to explain how mathematics, in its full precision, is applicable to the chaotic world of sense. However, with the realisation that pure mathematics no longer requires a basis in spatiotemporal construction, but can instead proceed 'formally' via strict logical deduction within an axiomatic system, it was no longer possible to maintain that any mathematical theory has a necessary relation to our sensory perceptions. Hence, there is no longer a single privileged framework. Many such frameworks are possible, and some of them were being applied to Einstein's theory of relativity at the time.

The positivist attempt to preserve a basically Kantian conception of knowledge and experience in the face of the collapse of Kant's doctrine of spatiotemporal construction created fundamental, and ultimately unresolvable tensions within their entire program. The problem confronting them can be expressed as follows:

1. They wished to maintain the privileged position of the exact sciences (especially mathematical physics) in the face of the rising problems associated with the verificationist account of science and the overthrow of the Newtonian cosmology.

2. In doing so, they dismissed causal speculation from science and relied on Kant's doctrine of the transcendental unity of apperception to secure perceptions within a

rigorous spatiotemporal mathematical framework. Thus, it was assumed that mathematics provided the theoretical framework needed to confer objectivity and rationality onto sensory perception.

3. Yet with Hegel and Nietzsche's critique of the transcendental unity of apperception (and with it the collapse of Kant's understanding of spatiotemporal construction), they understood that there is no longer any single privileged framework that can alone perform this 'objectifying' function. On the contrary, every framework appears to exemplify its own particular standards of objectivity and rationality. In response to this, the Marburg School of Hermann Cohen, Paul Natorp and Ernst Cassirer drew explicitly relativist conclusions (Cassirer, 1923; 1925; 1929; 1942; Schilpp, 1949; Hazelrigg, 1989; Friedman, 2000). Since there is no longer a single privileged framework, each framework may supply its own standards of truth, and hence objectivity. This position led to the 'coherence theory of truth', which views coherence and consistency within a particular framework (or set of assumptions) as sufficient for 'truth' (see Chapter 2.3). Since the positivists wished to maintain the privileged position of mathematical physics, however, the relativism of the Marburg School was anathema to them, and they went to great lengths to try to avoid the Marburg conclusions. Accordingly, the later positivist writings revolve around the problem of adjudicating between competing frameworks.

The adjudication problem was exacerbated when Schlick (1915) admitted that both Einstein's relativity and the classical explanations of Lorentz, FitzGerald and Poincaré could explain the data from the Michelson-Morley experiment of 1887. The two theories, it seemed, led to all the same empirical predictions. Thus they were deemed to be empirically equivalent138. Schlick admitted that positivism had no answer to the question of adjudicating between empirically equivalent frameworks and that Einstein's theory was more likely to be true because it was simpler than the competing aether theory (Schlick, 1915). However, there is no clear reason to believe that simplicity is a reliable guide to truth. As Friedman (1992) has questioned: "why in the world should nature respect our, merely subjective, preference for simplicity?"139

Throughout the inter-war years, a vigorous research program pursued by the positivists attempted to solve the problem of adjudicating between empirically equivalent theories of explanation. The outcome of all of this was the doctrine of conventionalism, which states that empirically equivalent theories (such as relativity and the aether theory) are really not two conflicting theories at all. Their disagreement is only apparent and so there is no need to

138 This is an early statement of the under-determination thesis, which states that no body of evidence can support any theory to the exclusion of all rivals.
139 The retreat to simplicity is a long-standing epistemic cop out. It can be traced to William of Ockham's (1300-1349) principle, known as Ockham's razor, which states that "when faced with competing explanations, accept the most simple", and ultimately to Aristotle, who claimed that "nature operates in the shortest way possible".

adjudicate between them. Rather than a disagreement over truth, there is only a pragmatic question of convenience. In this sense, the choice is purely a matter of convention.

A logical consequence of conventionalism was that empirical facts were deemed to be the only truth and that the 'cognitive meaning' of a scientific theory consisted solely in its implications for actual and possible observations. Indeed, the positivists acknowledged this with the notorious Verifiability Principle, which was wielded to question the 'cognitive meaningfulness' (or legitimacy) of all discourse regarding unobservables140. Happily, the Verifiability Principle could not be sustained. First, because it proved impossible to view advanced theories, such as Einstein's relativity, as mere summaries of actual and possible observations. That is, scientific theories often embody knowledge of unobservable phenomena (in the positivist's language, this means that scientists routinely do metaphysics). Second, and perhaps more fundamentally, the notion of 'observable facts' required the possibility of theory-neutral observation, a notion that the positivists had explicitly rejected141.

The story of logical positivism is a story of failure and has been discussed as such by various commentators (Andersson, 1994; Bell, 1994; Friedman, 1992; Laudan, 1996; Ray, 2000; Stroud, 2000). In particular, the positivists failed to develop an account of science based solely on the relationships between the data of sense experience. Accordingly, in the aftermath of the positivist episode, it has been common to afford metaphysical (non-empirical) speculation a place in science. However, this has raised a whole new set of difficulties associated with the status of unobservable entities. Several attempts have been made to clarify the new situation over the last 50 years. Out of these attempts, it may be argued that two broad positions have arisen. On the one hand are those who argue that scientific theories are true by virtue of their correspondence with the 'real', transcendental truth of the matter,

¹⁴⁰ The Verifiability Principle, the reader may have noted, is remarkably similar to Peirce's 'pragmatic maxim' and the similarities between the two help justify the assertion (made earlier) that Peirce actually began mapping out his position in largely positivist/empiricist terms.

¹⁴¹ The Verifiability Principle also leads to what is now known as the paradox of confirmation. The paradox arises from an attempt to characterise the relation between hypothesis and evidence. The most celebrated example, the raven paradox, begins with the hypothesis "all ravens are black", symbolised as (x)(Rx->Bx). According to Nicod's condition, a hypothesis is verified by its positive instances and falsified by its negative ones. Thus the observation "this is a raven and it is black", (Ra^Ba), verifies the hypothesis and the observation "this is a raven and it is not black", (Ra^-Ba), falsifies the hypothesis. The paradox arises from a fundamental principle of confirmation known as "the equivalence condition". According to the equivalence condition, if a hypothesis is confirmed by some evidence statement, E, then it is confirmed by any other evidence statement logically equivalent to E. In light of the equivalence condition, the logical equivalence of "all ravens are black" and "all non-black things are non-ravens" implies that these statements are supported by the same body of evidence, E. Therefore (and here is the paradox) the observation of a non-black, non-raven (-Ba^-Ra), such as a blue book, is a confirming instance of the hypothesis. Further, the hypothesis (x)(Rx->Bx) is logically equivalent to (x)[(Rx v -Rx)->(-Rx v Bx)]. Here the antecedent is a tautology, and thus any truth-value assignments that make the consequent true will confirm the hypothesis that all ravens are black. Since the consequent is a disjunction and the extensional deductive model of hypothesis testing requires only one of the disjuncts to be true for the compound to be true, "all ravens are black" is logically verified by any observation that is either black or a non-raven. Surprisingly then, the discovery of a star (or indeed the four-inch refractor used to view the star) confirms the generalisation that all ravens are black. These results should be disturbing to anyone who desires an account of science based upon empirical confirmation. Philosophers and logicians have handled the paradox in a variety of ways. Some have rejected Nicod's condition (which would be disastrous for an empirical/positivist account of science), others have abandoned the equivalence condition. However, many have argued that this paradox shows that scientific observations are systematically defective, since they would not give equal weight to observations, "ignoring the blue book and attending to the black raven" (Hempel, 1945; 1965; 1966; Goodman, 1954). For further elaboration of the paradox of confirmation the reader is referred to Trout (2000), from which much of this discussion has been sourced.
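The raven paradox described above can be made concrete in a few lines of code. The following is a toy illustration of my own (the objects and function names are hypothetical, not from any source): it applies Nicod's condition and the equivalence condition to a small domain and shows that a blue book counts as a confirming instance of "all ravens are black".

```python
# Toy model of the raven paradox. Hypothesis H: for all x, Raven(x) -> Black(x).
objects = [
    {"name": "black raven", "raven": True,  "black": True},
    {"name": "blue book",   "raven": False, "black": False},
    {"name": "white star",  "raven": False, "black": False},
]

def nicod_confirms(obj):
    # Nicod's condition: an instance (Ra ^ Ba) confirms H directly.
    return obj["raven"] and obj["black"]

def contrapositive_confirms(obj):
    # H is logically equivalent to: for all x, -Black(x) -> -Raven(x),
    # so by the equivalence condition a non-black non-raven also confirms H.
    return (not obj["black"]) and (not obj["raven"])

def falsifies(obj):
    # A negative instance (Ra ^ -Ba) falsifies H.
    return obj["raven"] and not obj["black"]

for obj in objects:
    status = ("falsifies" if falsifies(obj)
              else "confirms" if nicod_confirms(obj) or contrapositive_confirms(obj)
              else "irrelevant")
    print(f"{obj['name']}: {status} H")
```

Running the loop shows every object in this small world, including the blue book and the star, confirming the hypothesis, which is exactly the result Hempel found disturbing.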

whilst on the other are those who argue that no such correspondence can ever be substantiated and therefore scientific theories are merely 'instruments' for helping us correlate observational data and make predictions. As we shall see later (Chapter 2.3), these positions correspond to two separate understandings of truth. Furthermore, they also correspond to two separate understandings of scientific progress: one which claims that current scientific theories are 'truer' than their predecessors (in the sense that they more accurately represent the real world); the other which claims rival theories cannot be compared solely on objective, rational grounds and therefore current scientific theories are not necessarily 'truer' than the ones they replace. Whilst there are several scholars who argue for each of these positions, two that stand out as representative (and hence warrant a separate study) are Karl Popper and Thomas Kuhn.

2.2.3 Karl Popper and The Falsificationist Account of Science

"The wrong view of science betrays itself in the craving to be right."

- Karl Popper

Karl Popper was educated in Vienna in the 1920s at a time when the Vienna Circle (and logical positivism) were at the height of their influence. Popper himself tells the story of how he became disenchanted with the positivist school and their belief that science was especially reliable because it was derived from 'empirical data' (Popper, 1979). It was against this setting that Popper developed his critique of the verificationist account of science leading to his, now famous, split with the positivists¹⁴². Intellectual life in the Vienna of Popper's youth was dominated by science-based ideologies. Much of this was due to the privileged position science found itself in due to the wide-spread intuitive appeal of verificationism. Popper, however, became suspicious of the way in which he saw Freudians, Darwinists and Marxists supporting their theories by an appeal to the same empirical verification. Acceptance of these new 'sciences' had, as Popper observed:

"The effect of an intellectual conversion or revelation, opening your eyes to a new truth hidden from those not yet initiated. Once your eyes were thus opened you saw confirming instances everywhere: the world was full of verifications of the theory. Whatever happened always confirmed it. Thus, its truth appeared manifest; and unbelievers were clearly people who did not want to see the manifest truth." (Popper, 1969).

¹⁴² Indeed, the ensuing debates between Popper and the positivists (especially Carnap) were to become a feature of philosophy of science for the next four decades.

It seemed to Popper that these theories could never go wrong because they were sufficiently flexible to accommodate any instance of human behaviour or historical change as compatible with their theory¹⁴³. Consequently, although giving the appearance of being powerful theories confirmed by a wide range of facts, they could in fact explain nothing because they could rule out nothing. Popper, on the other hand, thought that a theory with genuine explanatory power would make risky predictions. Predictions that could be tested and, if they did not obtain, would refute the theory. For example, Einstein's theory had the implication that rays of light should bend as they pass close to massive objects (such as the Sun). As a consequence, a star situated beyond the Sun should appear displaced from the position it would occupy in the absence of this bending. Eddington looked for this displacement by viewing the star at a time when the light from the Sun was blocked out by an eclipse. As it happened, the displacement was observed and therefore Einstein's theory became widely accepted (or verified, as the positivist would claim). The point that Popper makes is that the apparent position of the star might not have been displaced. By making a specific, testable prediction that is logically deducible from the general theory, the theory stands to be falsified by observational evidence.

Popper's contribution to the positivist debate was, therefore, to completely discard two of the fundamental tenets of the school: the idea that science did not encompass metaphysics and the idea that theories were scientific if and only if they were verifiable (the verifiability principle). According to Popper, science begins with theories (or 'conjectures' as he put it) constructed from imaginative or even mythological speculations about the world. These conjectures provide the starting point for scientific investigation. In this sense, Popper stands in direct opposition to the empiricism of the positivists who denied a place for non-empirical structural conjecture in science. Furthermore, according to Popper, progress is made not by searching for confirming evidence, which can always be found, but by searching for falsifying evidence. It is falsification that progresses science by revealing the need for a new and better explanation. Thus, science progresses by trial and error, or as Popper would have it, by conjectures and refutations. Only the fittest theories survive. Although it can never be said of a theory that it is true, it can always be said that it is the best available and that it is better than any that have come before¹⁴⁴. Popper thus argues that the defining characteristic of scientific theories is not their verifiability but their falsifiability (Popper, 1969, 1972).

Popper's falsifiability criterion thus distinguishes science from other intellectual pursuits, among which he includes pseudo-science and metaphysics. In sharp contrast to the logical positivists, he refused to equate non-science with nonsense. According to Popper, it is sensible to include non-scientific knowledge in science. Indeed, science begins with non-empirical (or metaphysical) conjecture about causes and effects. It

¹⁴³ This observation was an early statement of what is now known as the under-determination thesis (see Chapter 2.3).
¹⁴⁴ Here Popper puts forward an ontological theory of truth (see Chapter 2.3).

progresses through empirical falsification and it never comes to rest in the sense of arriving at truth (as the verificationists believed). Although they cannot be tested scientifically, metaphysical (or pseudo-scientific) doctrines are often meaningful and important. Popper even credited pseudo-scientists like Freud and Adler with valuable insights that might one day play their part in a genuine science of psychology. His criticism was not that pseudo-scientific or metaphysical theories were nonsense, but merely that it is incorrect to believe these theories could be verified by searching out supporting evidence. It is this view of science that led Popper (1969) to describe it as:

"The method of bold conjectures and ingenious and severe attempts to refute them."

The basic function of a scientist, according to Popper, is to: "test theories, deduce consequences of theories and discover whether these consequences obtain. If they do not, the theory is refuted; if they do, the theory survives. The more tests the theory passes, the more credible it becomes". A visualisation of Popper's account of science (known variously as the falsificationist, hypothetico-deductivist or deductive-nomological account of science) is provided in Figure 4 below.

[Figure 4: The Falsificationist Account of Science. A cycle: Observation (the beginning of scientific inquiry) → 1. Abduction (the generation of hypotheses) → New Conjecture or Theory (speculative) → 2. Deduction (the implications of hypotheses) → Verifiable Consequence (deducible from the conjecture) → 3. Experimentation (the testing of hypotheses) → Observation (of experimental results) → 4. Refutation (the falsification of hypotheses).]
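One pass of the conjecture-and-refutation cycle just described can be sketched as a simple function. This is my own illustration under stated assumptions, not Popper's formulation: the function name, its parameters and the toy Eddington example are all hypothetical.

```python
# A minimal sketch of one pass of the falsificationist cycle:
# conjecture -> deduce consequence -> test -> refute or tentatively retain.

def falsificationist_cycle(conjecture, deduce_consequence, run_experiment):
    """conjecture: the current theory (any object);
    deduce_consequence: theory -> a testable prediction;
    run_experiment: prediction -> whether the prediction obtained (bool)."""
    prediction = deduce_consequence(conjecture)   # 2. Deduction
    observed = run_experiment(prediction)         # 3. Experimentation
    if not observed:                              # 4. Refutation
        return "refuted: a new and better conjecture is needed"
    return "corroborated: retained, but never proven true"

# Toy usage: Einstein's bold conjecture that starlight bends near the Sun.
result = falsificationist_cycle(
    "general relativity",
    deduce_consequence=lambda t: "star appears displaced during eclipse",
    run_experiment=lambda p: True,   # Eddington's 1919 observation
)
print(result)  # corroborated: retained, but never proven true
```

Note that even the passing branch returns only a tentative verdict, mirroring Popper's point that survival of a test never amounts to verification.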

According to Popper, his account of science has solved the long-standing problem of induction by defining science as an activity that does not involve inductive inferences at all (Popper, 1972). Instead of induction, deduction is used to reveal the consequences of theories so they can be tested and, perhaps, falsified. A feature of Popper's view of science is

that no claims are made that the survival of tests shows a theory to be true or even probably true. At best, the results of such tests show a theory to be an improvement on its predecessor. As such, the falsificationist settles for progress rather than truth. For this reason a hypothesis refuted is more valuable than one that survives the test (Popper, 1972). Popper put the essential point in a marvellous aphorism:

"The wrong view of science betrays itself in the craving to be right" (Popper, 1969).

Thus, it is only after an established theory has been refuted that a new (and better) one is proposed. Scientific progress can thus be visualised as a Burkean spiral, as new theories (with better explanatory and predictive power) replace old ones that have been empirically refuted (see Figure 5). It is this conception of progress that made Popper believe that scientific theories approach a 'correspondence' with the truth of the underlying nature of the 'real world'.


Figure 5: Scientific Progress According to the Falsificationist School

According to Popper, a scientific experiment is one in which some significant conjecture is at risk. This implies that every scientific experiment implicitly or explicitly embodies some theory and that this theory stands to be refuted by the results of the experiment (note that it can never be proved). As we have already mentioned, Popper uses the example of Einstein's falsifiable conjecture that (according to his theory of relativity) light passing near the Sun

should be bent by it¹⁴⁵. Popper was impressed by the contrast between Einstein's bold and falsifiable conjecture on the one hand and the pseudo-scientific schools of the Marxists, Darwinists and Freudians that dominated the Vienna of his formative years¹⁴⁶ (Magee, 1985).

Despite the enormous success of the falsificationist account of science in circumventing the problems of induction (by defining science as an activity that does not involve induction), it suffers from a number of inadequacies. These stem from both the logic of refutation and the socio-historical applicability of the theory in practice.

The first set of objections to the falsificationist account of science can be raised on logical grounds. Simplistically, the logic of falsification is said to follow the law of Modus Tollens, which has the form:

Premise 1: If H then O
Premise 2: But -O
Conclusion: Therefore, -H

Here, the first premise refers to the test statement associated with the falsification and the second to the empirical basis of the falsification. Obviously, given the truth of both premises, the conclusion holds by simple deduction. However, objections have been raised in regard to the truth of both of these premises. In regard to the truth of test statements, objections have been raised on the grounds that it is always possible to protect a theory from falsification by deflecting the falsification onto some other part of the complex web of assumptions that face the tribunal of observation in every experiment. This position is known as the Quine-Duhem thesis: the thesis that most advanced scientific theories are a complex interlocking set of concepts, observations, definitions, presuppositions, experimental results and connections to other theories and that no single fact is going to be crucial to the survival of the theory. In fact, facts radically under-determine theories, the consequence of which is that it is impossible, on purely logical grounds, to falsify a sophisticated scientific theory experimentally: any observation can be accommodated by making suitable adjustments to the 'web of beliefs'. A corollary of the Quine-Duhem thesis is the under-determination thesis, which states that when comparing sophisticated scientific theories there does not exist a crucial experiment¹⁴⁷.
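The logical point here can be made explicit in a few lines. The sketch below is my own illustration (the function and variable names are hypothetical): a prediction O is deduced from the hypothesis H together with a web of auxiliary assumptions A, so when O fails, Modus Tollens refutes only the conjunction of H and A, never H in isolation.

```python
# Quine-Duhem in miniature: O follows from H together with auxiliaries,
# so a failed observation (-O) refutes only (H and A1 and A2 ...), not H alone.

def deduce_prediction(hypothesis_holds, auxiliaries_hold):
    # The prediction O obtains only if H and every auxiliary assumption hold.
    return hypothesis_holds and all(auxiliaries_hold)

H = True                    # the core theory may in fact be true ...
A = [True, False, True]     # ... while one auxiliary assumption is at fault

O = deduce_prediction(H, A)
print(O)   # False: Modus Tollens licenses -(H and A), not -H
```

The scientist is therefore logically free to "deflect the falsification" onto any element of A, which is precisely the manoeuvre Lakatos dramatises below.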

¹⁴⁵ When the results from the 1919 eclipse experiment became known, Professor Littlewood sent an excited note to Bertrand Russell: "Dear Russell, Einstein's theory is completely confirmed" (Checkland, 1981b). According to Popper, this sort of rhetoric highlights a profound misunderstanding of the nature of scientific knowledge. Perhaps a better response would have been along the lines of "Einstein's theory has survived this difficult test".
¹⁴⁶ Popper's position on such pseudo-sciences is perhaps best presented in his paper Darwinism as a Metaphysical Research Programme, in which he claims that evolutionary theory was not a scientific theory capable of passing the test of falsification, but a metaphysical research programme (Popper, 1974a).
¹⁴⁷ An experiment E is crucial between T1 and T2 if and only if T1 predicts that E will yield O and T2 predicts that E will yield -O. A classic example of this is Foucault's experiment designed to decide between the wave and particle theories of light. According to the wave theory, the velocity of light in water should be less than the velocity of light in air, whilst according to the particle theory, the velocity of light in water should be greater than the velocity of light in air. Foucault's experiment supported the wave theory. However, it was noted that in deriving -O from the particle theory a set of auxiliary assumptions were required; thus, the particle theory could always be saved by altering the

Lakatos (1977) made a similar point when he stated:

"Is, then, Popper's falsifying criterion the solution to the problem of demarcating science from pseudo-science? No. For Popper's criterion ignores the remarkable tenacity of scientific theories. Scientists have thick skins. They do not abandon a theory merely because facts contradict it. They normally either invent some rescue hypothesis to explain what they then call a mere anomaly, or, if they cannot explain the anomaly, they ignore it, and direct their attention to other problems."

Moreover, he demonstrated how this may work by making use of a hypothetical scenario involving Newtonian gravitation:

"A physicist of the pre-Einsteinian era takes Newton's mechanics and his laws of gravitation, N, the accepted initial conditions, I, and calculates, with their help, the path of a newly discovered small planet, p1. But the planet deviates from the calculated path. Does our Newtonian physicist consider that the deviation was forbidden by Newton's theory and therefore, once established, it refutes the theory N? No. He suggests that there must be a hitherto unknown planet p2, which perturbs the path of p1. He calculates the mass, orbit, etc. of this hypothetical planet and then asks an experimental astronomer to test his hypothesis. The planet p2 is so small that even the biggest available telescope cannot possibly observe it; the experimental astronomer applies for a research grant to build yet a bigger one. In three years' time, the new telescope is ready. Were the unknown planet p2 to be discovered, it would be hailed as a new victory of Newtonian science. But it is not. Does our scientist abandon Newton's theory and his idea of the perturbing planet? No. He suggests that a cloud of cosmic dust hides the planet from us. He calculates the location and properties of this cloud and asks for a research grant to send up a satellite to test his calculations. Were the satellite's instruments to record the existence of the conjectural cloud, the result would be hailed as an outstanding victory for Newtonian science. But the cloud is not found. Does our scientist abandon Newton's theory, together with the idea of the perturbing planet and the cloud, which hides it? No. He suggests that there is some magnetic field in that region of the universe, which disturbed the instruments of the satellite. A new satellite is sent up. Were the magnetic field to be found, Newtonians would celebrate a sensational victory. But it is not. Is this regarded as a refutation of Newtonian science? No. Either yet another ingenious auxiliary hypothesis is proposed or ... the whole story is buried in the dusty volumes of periodicals and the story never mentioned again" (Lakatos, 1970).

auxiliary assumptions. The under-determination thesis, together with the related Quine-Duhem thesis, provides a powerful rebuttal to the falsificationist account of science.

Thus, sophisticated theories are not only unverifiable but unfalsifiable as well. Or, in Lakatos' (1970) words: "scientific theories are not only equally unprovable and equally improbable, but they are also equally undisprovable."

The difficulties facing the logic of falsification, however, are not limited to the supposed 'truth' of test statements (Premise 1). The supposed 'truth' of the empirical basis of the falsification (Premise 2) has also come under attack. As we have seen, the 'theory-ladenness of observation' states that judgements direct what a scientist observes and what s/he passes over (Bhaskar, 1986; Hollway, 1989). In other words, what an observer sees is not determined solely by the images on their retinas, but also by the experience, knowledge and expectations of the observer (Chalmers, 2000). Consequently, when observation (or experimentation) provides evidence that conflicts with theory, it may be the perception of the evidence that is at fault rather than the scientific theory. Nothing in the logic of the situation requires that it is always the theory that should be rejected on the occasion of a clash with observation.

Thomas Kuhn (1970a) raised a number of examples from the history of science where observations have been influenced by the expectations of the scientist. For example, Kuhn (1970a) tells the story of the 'discovery' of the planet Uranus by Sir William Herschel in 1781. What Kuhn finds interesting about this story is that Uranus was actually 'discovered' (in the sense that it was observed) on at least 17 different occasions between 1690 and 1781, but each time was held to be a star (Andersson, 1994). After Herschel 'discovered' that Uranus was actually a planet, however, astronomers began 'seeing' a planet where previously they saw a star. In order to explain this, Kuhn (1970a) referred to psychological experiments that showed the same drawing being seen in different ways by different observers. A particularly famous example was one discussed by Wittgenstein (1953), which could be seen as either a duck or a rabbit. According to Gestalt psychology, at some point the observer 'sees' the hitherto unseen second image and a 'switch' occurs whereby the second image is the one observed from there on. In a similar way, Kuhn suggests that: "what were ducks in the scientist's world before the revolution are rabbits afterwards". Whereas before Herschel astronomers saw a star, afterwards they began seeing a planet. Thus, observation itself is theory-laden.

As we have already stated, whilst the logic of Modus Tollens is beyond dispute, for the conclusion to be true the premises must also be true. However, it seems that test statements (Premise 1) suffer from a lack of crucial experiments and that the empirical basis of the falsification (Premise 2) suffers from the theory-ladenness of observation. Accordingly, the entire falsificationist account begins to run into insurmountable difficulties.

These difficulties are compounded by the embarrassing fact that if the falsificationist account had been strictly adhered to in practice, then those theories generally regarded as being among the best scientific theories would never have been developed because they would have been rejected in their infancy (Chalmers, 2000). For example, in the early years of its life, Newton's gravitational theory was falsified by observations of the moon's orbit. It took almost fifty years to deflect this falsification on to causes other than the theory (Chalmers, 2000). Later in its life, the same theory was known to be inconsistent with the details of the orbit of the planet Mercury. Scientists did not abandon the theory for this and it turned out that it was never possible to explain away this falsification, yet the theory remained the dominant scientific worldview for hundreds of years (Chalmers, 2000).

There are numerous other examples of scientists resisting falsifying evidence in support of their own theories, including such revolutionaries as Copernicus, Maxwell and Bohr (Feyerabend, 1975). In fact, Kuhn (1962; 1963; 1970a; 1970b) argues that anthropological studies suggest that scientists rarely, if ever, try to falsify the dominant theories of their disciplines. Rather, these theories become the contextual certainties for the discipline (recall Section 2.1.2) and are only ever rejected when a new theory comes along that 'fits' better with the whole inchoate set of concepts, observations, definitions, presuppositions, experimental results and connections to other theories that make up a dominant paradigm. As such, the psychology of research rarely ever matches the Popperian logic of research. Indeed, Kuhn (1970b; 1977; 1998) goes so far as to suggest that the history of science casts serious doubts on Popper's argument that science progresses cumulatively towards truth. According to Kuhn, new theories often solve problems associated with old ones but introduce different problems at the same time. As such, there are usually pluses as well as minuses in every instance of theory change.

Given the logical and historical inadequacies of falsification, it is reasonable to conclude that theory choice is typically made on grounds that include things other than strict logical deduction and empirical testing. This view, as the reader may recall, was held by the pragmatists, William James and John Dewey, at the turn of the 20th century and has received much recent attention within the philosophy of science by the so-called sociologists of science. Accordingly, it is to the sociological view that we now turn, beginning with its foremost exponent - Thomas Kuhn.

2.2.4 Thomas Kuhn and The Socio-Historical Account of Science

"Science is fundamentally a social undertaking."

- Thomas Kuhn

Notwithstanding its initial success, Popper's criterion of falsification has met an increasing number of difficulties and, as such, has largely been abandoned. To begin with, the ironical observation that if Popper's method had been strictly adhered to, the most celebrated scientific theories would have been falsified (and thus discarded early on in their development) has been used to question the historical accuracy of falsification (not to mention its sociological workability). Secondly, the Quine-Duhem thesis has highlighted the difficulty in determining which part of the 'web of assumptions' is falsified by a clash with observation, which has, in turn, undermined the applicability of the logic of Modus Tollens to science. No theory, it seems, faces the 'tribunal of observation' in isolation. Finally, even the validity of the tribunal itself has been called into question, with the theory-ladenness of observation thesis throwing doubt on the supposed objectivity of observation. It seems, therefore, that Popper's account of science fails to grasp the full complexity of the mode of development of major scientific theories. Since the 1960s it has been common to conclude from this that a more adequate account of science must be anthropological as well as logical. Such an account would seek to understand the psychological, social and cultural frameworks in which the scientific activity takes place. One of the reasons for this stems from the history of science. Historical study reveals that the evolution and progress of major sciences exhibit a social dimension that is not captured by either the verificationist, positivist or falsificationist accounts (Chalmers, 2000).

The sociological view burst onto the intellectual scene with the publication of Thomas Kuhn's book The Structure of Scientific Revolutions (Kuhn, 1962). According to Kuhn, "history, if viewed as a repository for more than anecdote or chronology, could produce a decisive transformation in the image of science by which we are now possessed". The common image of science until Kuhn was one in which science was progressing cumulatively towards greater truth and mastery of the world around us. However, Kuhn derides this view as amounting to: "little more than a tourist brochure". According to Kuhn, the cumulative view of science fails because the selection of what theories constitute part of the cumulative historical narrative is always generated by present science and only those elements of past science that lead to present science are included (Hoyningen-Huene, 1993). As such, the cumulative view underspecifies the difference between older scientific worldviews and present ones. Kuhn's account of science, therefore, was developed as an attempt to accurately reflect the major differences between older and present scientific worldviews and, thereby, keep philosophy of science in line with the history of science. The key features are the emphasis placed upon the revolutionary nature of scientific progress and the role played by the sociological characteristics of scientific communities during what he terms 'normal science'. Kuhn's picture of the way science progresses can be summarised by Figure 6 below.

[Figure 6: Scientific Progression according to Thomas Kuhn's Historical View. A cycle: Pre-science → Adoption of a single paradigm → Normal science → Accumulation of falsifying evidence → Crisis → Efforts to resolve crisis (may resemble pre-science) → Revolution → Adoption of a new paradigm → Normal science ...]

The disorganised and diverse activity that precedes the formation of a science (pre-science) eventually becomes structured and directed when a single paradigm becomes adopted by the scientific community. A paradigm is composed of the general theoretical assumptions that the members of a particular scientific community adopt. Included within these assumptions are an array of problems considered important to investigate, theories about underlying structure and causal relations, theories that interpret the underlying structure empirically and methodological guidelines. Workers within a paradigm practice what Kuhn terms normal science. These workers will articulate and develop the paradigm in their attempt to account for the behaviour of the phenomena relevant to their discipline. In doing so, they will inevitably experience difficulties and encounter apparent falsifications; however, they will usually deflect these falsifications onto some other aspect of the study rather than the prevailing paradigm. One reason for this is that the paradigm continues to yield theoretical understanding and empirical insights despite anomalies. However, there are always a number of non-intellectual factors that buttress the stability of paradigms to the point where they become 'doctrine'. One is professional training (standard textbooks are written assuming a particular paradigm).

Another is professional authority (recognised leaders in a field espouse the paradigm)¹⁴⁸. Furthermore, proposals for research funding must be acceptable within a paradigmatic framework and professional publications largely screen out papers that breach the pattern of normal science¹⁴⁹.

However, with the accumulation of falsifying evidence and the realisation that the prevailing paradigm is exhausting its potential for new discovery, the community eventually becomes aware of a crisis emerging within the paradigm. The crisis is resolved when an entirely new paradigm emerges and attracts the allegiance of more and more practitioners within the discipline. Eventually, the original paradigm is abandoned and a 'scientific revolution' takes place. The new paradigm, full of promise and not beset by any apparent difficulties, now guides normal science until it too runs into serious trouble. Thus, science progresses not by becoming more objective or mathematically rigorous (as the positivist would have it) but by becoming more imaginative.

The defining contribution of Kuhn's work is the revelation that science is fundamentally a socio-cultural undertaking (Kuhn, 1970b). Thus, the development of scientific texts, institutions, methodologies and, most importantly, theories of explanation, is a social process subject to all the same political, sociological, literary and anthropological influences as any other social process. In contrast to Popper and the logical positivists, Kuhn is not much interested in partitioning science from non-science. However, when pressed, he has claimed that: "the existence of a paradigm capable of supporting a normal science tradition is the characteristic that distinguishes science from non-science" and justifies the statement by the observation that 'normal scientists' are usually highly uncritical of the paradigm in which they work (Kuhn, 1962). Indeed, according to Kuhn, it is only by being uncritical of the paradigm that scientists are able to concentrate their efforts on the detailed articulation of their discipline (as seen through the paradigm). It is this lack of disagreement over the fundamentals that distinguishes normal science from pre-science, which, according to Kuhn, is characterised by the fact that "there will be almost as many theories as there are workers in the field" (Kuhn, 1962)¹⁵⁰.

Following Wittgenstein, Kuhn suggests that there is more to a paradigm than what can be tabled in the form of specific rules and directions. If one tries to give a precise characterisation of a paradigm in the history of science, it always turns out that some work within the paradigm violates the characterisation (Kuhn, 2000). However, Kuhn insists that this state of affairs

148 A good example of authoritative knowledge is the hypothesised existence of black holes. When Subrahmanyan Chandrasekhar was a young scientist he performed a set of calculations suggesting that massive stars end their lives in gravitational collapse (i.e. a black hole). The foremost authority in astrophysics at the time, Sir Arthur Eddington, could not accept such an absurd sounding conclusion and thought that something (never specified) must have been wrong with the theory - and he said so. Although Chandrasekhar had the better argument, Eddington's conclusion was almost universally accepted for a number of decades. 149 The concept of peer review within the sciences thus becomes a powerful tool for perpetuating the dominant paradigm. Burke (2000) discusses this under the heading cultural inertia.

does not render the concept of a paradigm untenable. Even though there is no complete characterisation, individual scientists acquire knowledge of a paradigm through their scientific education: by solving standard problems, performing standard experiments and eventually completing a piece of research under a supervisor who is already a skilled 'paradigm practitioner'. The aspiring scientist will not necessarily be able to give an explicit account of the methods and skills s/he has acquired because much of the knowledge will be tacit, in the sense developed by Michael Polanyi (1973).

Kuhn's conception of a paradigm may be seen as a reformulation of the 'theory-ladenness of observation'. Paradigms are like 'reference frames'. Normally, our paradigm (reference frame) is taken for granted and mistaken for reality. Indeed, most people are not aware they are walking around carrying a frame of reference at all151. The reinforcing nature of such paradigms is discussed by de Bono in his book I am Right, You are Wrong (de Bono, 1990). In it, de Bono states:

"Any [science] sets out a scaffold for perception which permits us to seek data which will reinforce [the paradigm]. In all these cases we see a broad type of circularity taking effect" (de Bono, 1990).

The mere existence of unsolved puzzles within a paradigm does not constitute a crisis. Indeed, Kuhn recognises that paradigms will always encounter anomalies and apparent falsifications. It is only under special sets of socio-cultural conditions that these difficulties can develop in such a way as to undermine confidence in the entire paradigm. According to Kuhn, an analysis of the characteristics of a crisis period in science demands the competence of a psychologist and sociologist as well as that of a historian (Kuhn, 1970b). A crisis becomes serious when a rival paradigm appears. The new paradigm will be incompatible with the old one. Each paradigm will usually regard the world as being composed of different kinds of things and thus regard different kinds of questions as being meaningful. Kuhn argues that there is a sense in which proponents of rival paradigms "are living in different worlds" (Kuhn, 1977). In this sense, he claims that rival paradigms are incommensurable (Kuhn, 1977). A scientific revolution occurs when the relevant scientific community, as a whole, abandons the old paradigm in favour of the new one152.

150 Kuhn offers optics before Newton as an example of pre-science. 151 And those that are aware that they carry a frame of reference (or paradigm) through which they understand the world are often unable to completely understand its effects due to its tacit nature. 152 The notion of 'paradigm', despite its usefulness in explaining the manner in which scientists learn and practice, exaggerates the extent to which a paradigm constitutes a single, coherent, closed system. Since Kuhn defines paradigms as closed systems (employing incommensurable languages), change can only ever be revolutionary - the entire paradigm must be totally accepted or totally rejected. Popper criticises the idea of incommensurability in his essay The Myth of the Framework, where he defines the myth as follows: "a rational and fruitful discussion is impossible unless the participants share a common framework of basic assumptions, or, at least, unless they have agreed on such a framework for the purposes of discussion" (Popper, 1994). Whilst I am in general agreement with Popper's critique of radical incommensurability, I am sceptical of Popper's characterisations of 'rational' and 'fruitful'. According to Popper, some frameworks are intrinsically superior to others and the 'fruits' of a 'rational' discussion between parties will be the discovery by one party of the hitherto unknown superiority of the alternative framework. According to Popper, paradigms can be compared (and hence ordered) on purely rational

Some aspects of Kuhn's writings give the impression that his account of the nature of science is a purely descriptive one, that is, he aims to do nothing more than describe the practice of science. However, Kuhn insists that his account constitutes a theory of science because it includes an explanation of the function of the various components (Kuhn, 1963). Certainly, there is something descriptively correct in his idea that scientific work involves solving problems within a framework that is, in the main, unquestioned. A discipline in which fundamentals are constantly brought into question (as characterised by Popper's Conjectures and Refutations (Popper, 1969)) is unlikely to make significant progress simply because principles do not remain unchallenged long enough for scientific work to be done. Indeed, it has been suggested that it is philosophy, not science, which comes closest to being adequately characterised by Conjectures and Refutations (Chalmers, 2000).

After the publication of The Structure of Scientific Revolutions, Kuhn was charged with having put forward a relativist view of scientific progress (Hoyningen-Huene, 1993). That is, Kuhn proposed an account of progress according to which the question of whether a new paradigm is better than the one it has replaced does not have a definitive (or absolute) answer, but depends on the values of the individual, group or culture that makes the judgement. Kuhn was clearly not comfortable with that label and in the postscript to the second edition he tried to distance himself from it (Kuhn, 1970a). However, his insistence that science is intrinsically sociological and that understanding science involves "examining the nature of the scientific group, discovering what it values, what it tolerates, and what it disdains" inevitably leads to relativism if it transpires that different groups value, tolerate and disdain different things. This, indeed, is how the constructivist school interprets Kuhn, developing his views into an explicit relativism153.

Perhaps the most influential exponent of the relativist-constructivist school is Paul Feyerabend, whose controversial account is outlined in his book Against Method: Outline of an Anarchistic Theory of Knowledge (Feyerabend, 1975)154. According to Feyerabend, the

grounds. It is this sort of rationality that Wittgenstein (1953) critiqued in Philosophical Investigations, Rorty (1979) in Philosophy and the Mirror of Nature and Feyerabend (1975) in Against Method. 153 Social constructivism is the position that affirms that the prevailing scientific paradigm is a construction of the socio-cultural processes at work within the relevant scientific community (as opposed to a representation that has closer correspondence to reality than its predecessor). The beginnings of constructivism can be found in Kant's Copernican tactic of implicating the 'self' in all seemingly objective representations of the world. However, whereas Kant assumed that all human kind wielded the same set of 'cookie cutters' on the dough, the constructivists think that different groups (sometimes linguistic, sometimes social, sometimes disciplinary) will wield different a priori concepts. Given that the mind's a priori categories differ from group to group, each group 'constructs' slightly different worldviews. Furthermore, each worldview is relative to the a priori concepts imposed and not absolute. Applied to science, social constructivists typically claim that electrons, muons, curved space-time etc. all exist relative to a particular theory (or discipline) but do not exist relative to past theories (or do not exist relative to other disciplines) and can never be said to exist in any absolute sense. 154 Against Method was dedicated to Imre Lakatos as "friend and fellow anarchist". The implication being that Lakatos' attempts to find a rational basis for scientific progress through his methodology of scientific research programmes had failed and, accordingly, he had no other recourse but to adopt Feyerabend's 'epistemological anarchism'. Lakatos, for his part, was set to reply to Feyerabend's critique in the same publication (his part entitled For Method); however, his untimely death meant that the proposed joint work, For and Against Method, never eventuated.

goal of science is to expand knowledge. Movement toward this goal is facilitated by the rapid generation of theories of all types. Constraints that hinder the generation and consideration of theories must be confronted and removed. Methodological standards are one such source of constraint: they function to (falsely) legitimate some theories and inhibit the consideration of others. The culmination of Feyerabend's case against method is that there is no scientific method and that scientists should embrace methodological pluralism. According to Feyerabend, history reveals that if there is a single, unchanging principle of scientific method, it is the principle that "anything goes" and the only appropriate response is the Maoist one: "let a thousand flowers blossom".

Feyerabend's thesis that there is no scientific method throws much of the work of all who have come before him in the philosophy of science into serious question. As we have seen, one of the principal problematics within the discipline has been the issue of demarcation: the idea that science is demarcated from other forms of knowledge by certain criteria. The verificationists and positivists partitioned science from non-science by the existence of confirmatory empirical evidence (verifiability), the falsificationists by the possibility of refuting evidence (falsifiability) and Kuhn by the existence of a paradigm able to support a 'normal science' tradition (agreement). In opposition to all who have come before, Feyerabend denies that science is a privileged form of knowing and sees it simply as the sacred superstitions of recent Western culture (Laudan, 1996). The truth, he suggests, is that:

"Science is much closer to myth than a scientific philosophy is prepared to admit. It is one of the many forms of thought that have been developed by man, and not necessarily the best. It is conspicuous, noisy, and impudent, but it is inherently superior only for those who have already decided in favour of a certain ideology, or who have accepted it without ever having examined its advantages and its limits" (Feyerabend, 1975).

Moreover, if the aim is to progress science through Kuhnian revolution, then according to Feyerabend, we must be willing and able to generate competitor theories to the ruling one. Scientific progress is thus enhanced by theoretical pluralism as well as methodological pluralism. According to Feyerabend, by defining science as an activity that is able to support a 'normal science' tradition, Kuhn seems to be legitimating uncritical dogmatism. Feyerabend labels this the principle of tenacity - the idea that paradigm practitioners tend to tenaciously defend the dominant theories of their disciplines in the face of seemingly falsifying evidence. In contrast, Feyerabend suggests that science adopt the principle of proliferation (the idea that scientists should be encouraged to continuously generate new theories) at the same time as the principle of tenacity. Whereas Kuhn suggests that tenacity and proliferation should govern different phases of research (normal science and pre-science), Feyerabend recommends that they operate at the same time. Thus, theoretical pluralism becomes a feature of normal science.

Whilst Feyerabend is unashamedly relativist-constructivist in regard to theory choice, Kuhn's writings contain two incompatible strands, one relativist-constructivist and the other not. This opens up two possibilities within Kuhnian interpretation:

1. To ignore the relativism and rewrite Kuhn in a way that is compatible with some overarching sense in which a paradigm can be said to constitute progress over the one it replaces.

2. To embrace the relativism within Kuhn's writings and develop a view of scientific progress that looks beyond the cumulative perspective on theory choice.

The first path, which may be termed the logical interpretation of Kuhn, suggests that a purely rational philosophy of science is possible. Paradigms can, therefore, be compared using strictly logical criteria (i.e. it is possible to generate an algorithm for theory choice) and, as such, the prevailing paradigm is always an improvement on its predecessor (in that it is closer to how the world truly is)155. This is the path of Imre Lakatos and other critical rationalists who associate science with a neutral search for truth. According to the critical rationalists, the products of science are legitimated with respect to objective criteria such as their degree of correspondence with the absolute truth of how the world really is. Such an interpretation must, therefore, legitimate its claims to truth (as correspondence) with respect to ontology (the nature of the 'real', mind-independent, world).

The second path, which may be termed the sociological interpretation of Kuhn, questions the entire attempt by philosophy of science to construct an algorithm for theory choice. Rather, it suggests that a purely logical account of scientific progress is impossible. According to the sociological interpretation, if the replacement of one theory by another requires some sense of cumulative progression, then the bulk of scientific practice must be considered irrational because scientific communities routinely accept new theories that could not explain everything that the old theory could. Moreover, the sociological interpretation suggests that a cumulative view of progress cannot be sustained because different paradigms are either wholly, or partially, incommensurable. The prevailing paradigm, therefore, is as much an emergent property (or social construction) of the disciplinary socio-culture as it is an obvious enhancement over its predecessor. This is the path taken by Paul Feyerabend and other sociologists of science who associate science with a community of inquirers. According to the sociologists of science, the products of science are legitimated with respect to subjective criteria (such as their usefulness, degree of acceptance, simplicity, etc.) or with respect to

155 According to the logical interpretation of Kuhn, the replacement of one theory by another is always progressive in the sense that the new theory can explain everything that the old theory could and more besides.

relative criteria (such as their degree of coherence with other accepted theories)156. Such an interpretation is agnostic about ontology (the nature of the 'real', mind-independent, world).

Given that the logical and sociological interpretations of Kuhn are tied up with understandings of truth, ontology and the relationship of both to scientific knowledge, these issues will be explored in further detail in Chapter 2.3.

2.2.5 Conclusions: The Illegitimacy of the Honorific 'Science'

"Science plays its own game; it is incapable of legitimating the other language games ... But above all, it is incapable of legitimating itself, as speculation assumed it could ... [this] is changing the meaning of the word knowledge. A Game Theory specialist whose work is moving in this same direction said it well: 'wherein, then, does the usefulness of Game Theory lie? Game Theory, we think, is useful in the same sense that any sophisticated theory is useful, namely as a generator of ideas'."

- Jean Francois Lyotard

The reader will note that this Chapter has yet to suggest a precise characterisation of 'science' capable of overcoming the problems associated with the characterisations that have thus far been presented. The description of science that seems to be emerging, therefore, may be objected to on the grounds that it is too vague. Part of the response to that charge is to admit that it is vague, but to argue with Chalmers (2000) that the lack of a precise definition of science (and associated demarcation between science and non-science) is not a weakness but one of the real strengths of the emerging picture. As Midgley (2000) has stated: "we cannot know the exact relationship between knowledge, the language we use to frame knowledge and reality". As such, any account of the relationship between scientific theories, the methodologies we use to generate those theories and the world that those theories are intended to be about must contain a degree of uncertainty.

One of the weaknesses of pre-Kuhnian accounts of science is that they assume that there is a single category 'science', and that various domains of knowledge (e.g. physics, biology, psychology, management, history, etc.) or various methods of inquiry (e.g. controlled experimentation, mathematical modelling, critical theory, hermeneutical studies, etc.) either come under that category, or do not. As we have seen, the project of philosophy of science has, to date, been unable to establish a single categorisation that can come to terms with the various methods of investigation and socio-cultural processes at work within even the most

156 David Bloor and the Sociology of Knowledge group at the University of Edinburgh have explicitly pursued the sociological interpretation of Kuhn. The sociologists of knowledge claim that science, as a whole, does not have logical grounds for theory choice. The research from Edinburgh has included personal, professional, social and

unambiguous 'scientific' disciplines. It seems, therefore, that no one (not even the most advanced scientistic apologist) can achieve consensus on how to separate the scientific sheep from the non-scientific goats. As Chalmers (2000) points out:

"There is no general account of science and scientific method to be had that applies to all sciences at all historical stages in their development."

Science, it seems, is a mixed bag and could possibly only ever be seen through Wittgensteinian lenses - as a family of activities with various similarities. Many features are common to many disciplines, but no set is definitive. Moreover, no process of inquiry, thus far suggested as the criterion of scientificity, has been able to withstand serious criticism. As such, it seems that the boundary between 'science' and 'non-science' is vague, subjective and value-laden.

Despite this state of affairs, however, science (as opposed to non-science) has achieved considerable prestige in contemporary society. Indeed, the term 'scientific' is commonly used as an epistemic honorific. However, if science cannot be partitioned from non-science, then the use of such an honorific is hard to justify157. Indeed, the very idea of there being a thing called 'science' comes into question.

So where then does legitimacy reside?

Certainly not in acclaiming (or denigrating) items of knowledge because they conform (or don't conform) to some homespun criterion of scientificity. As Lyotard (1979) has argued:

"Science plays its own game; it is incapable of legitimating the other language games ... But above all, it is incapable of legitimating itself, as speculation assumed it could".

Each area of knowledge must, therefore, be analysed separately by investigating its aims, the methods used to accomplish those aims and the degree of success achieved. It is contended here that any area of knowledge (whether traditionally categorised as scientific, or not) should stand, or fall, by the degree to which its methods have been able to achieve the aims it has set for itself and whether these aims are in any way useful or interesting. Such a position stands in direct contrast to the narrative that modern science has traditionally used to legitimate itself - that science is able to uncover (or approximate) the truth of how the world really is. Accordingly, it is this narrative that is the topic of the next Chapter.

political interests as part of the set of factors used for theory choice. Indeed, they have produced an imposing set of historical case studies, which they claim illustrates this (Bloor, 1991). 157 Moreover, the use of the adjectival 'scientific' knowledge is seen for what it is - little more than a scientistic illusion.

2.3 Scientism and the Quest for Certainty

"And you shall know the truth, and the truth shall set you free."

- Jesus of Nazareth

2.3.1 Introduction: The Science Wars and The New Narrative of Legitimation

"Truth is severe, hard, demanding."

- Plato

In the previous Chapter it was shown that serious inquiry into the nature of science undermines much of the authority that the word has come to enjoy within the modern era. Serious doubts have been cast on the status of such scientific icons as the belief that:

1. Scientific laws and theories are verifiable.

2. Science is based on objective facts.

3. Rigour (mathematical representation) is able to confer objectivity onto subjective observations.

4. Scientific laws and theories are falsifiable.

5. Scientists are neutral and happy to "dump their whole cartload of beliefs the moment experience is against them".

6. It is possible to partition science from non-science by some process of inquiry that may be termed 'the scientific method'.

7. It is always possible to compare competing theories on purely rational grounds.

These attacks, together with the subsequent retaliation by scientists (and philosophers of science), have been collectively labelled the 'science wars' by observers and commentators (Gross & Levitt, 1994; Ross, 1996; Sokal, 1996; Sokal & Bricmont, 1999; Holton, 2000; Sardar, 2000; Ashman & Basinger, 2001; Parsons, 2003). Indeed, much of the philosophy of science literature over the past 50 years has been dedicated to this broad-ranging dispute. It

seems, however, that these 'wars' are now coming to a close (at least amongst the philosophy of science community) and that a cease-fire has been negotiated by both sides. Kuhn and his followers have relaxed their definition of incommensurability such that it does not entail indiscussability, whilst most of their critics have conceded that the sociological interpretation of theory choice is well founded and cannot easily be dismissed. Moreover, it seems that everyone has despaired of ever being able to partition science from non-science.

Accordingly, much recent attention has moved from method to metaphysics: from attempting to methodologically formulate (or deny the validity of) algorithms for theory choice (and thus partition science from non-science) to attempting to show (or deny) that science can, nonetheless, be said to be progressing closer towards the truth of how the world really is. This is the new narrative of legitimation of modern science. It is different to the old one in that it wanders into metaphysical questions about the nature of truth itself (the old narrative pretended that science was free from metaphysics). In the previous Chapter, we examined the old metanarrative and critiqued the 'will to methodology' that underpins it. Here we examine the new one and hope to critique the 'quest for certainty' associated with it.

Much of the literature on truth is discussed within the context of the ontological positions 'realism' and 'anti-realism' (see Alston, 1979). However, it is argued here that this categorisation leads to an impasse and disguises the fact that truth is not solely an ontological concept, but a concept that brings together various aspects of ontology, epistemology and linguistics. Accordingly, this thesis recontextualises the various theories of truth, with the aim of breaking the impasse between realism and anti-realism. It does so by distinguishing between those theories of truth that rely on a particular ontological condition to obtain (what this thesis calls the ontological theories of truth) and those that do not (what this thesis calls the anthropological theories of truth).

2.3.2 The Ontological Theories of Truth

"Though anti-realism may seem to be everywhere ... Australia, isolated and out of the loop evolutionarily, continues as a stronghold of realists and marsupials."

- David Stove

2.3.2.1 Realism and Truth

"Whatever there is, is what is, regardless of how we think of it."

- W. Alston

The term realism is a relatively recent invention. However, following Wittgenstein's idea of 'family resemblance', it is possible to see the doctrine within the literature for about as long as there has been a literature to speak of158. Indeed, a brief survey highlights the fact that there is no one doctrine of realism, but a range of related positions that may, or may not, bear the appellation. Therefore, it is not particularly realistic to expect the reader to understand exactly what I have in mind when I use the term - a description of this 'family resemblance' is called for.

To begin with, realism is an ontological position, not a theory of truth159. Furthermore, ontological realism is almost universally composed of two inter-related claims:

1. That something exists independent of any mind, subject or observer.

2. That such and such an entity can be said to exist in a mind-independent manner.

The first claim constitutes the minimal realist position: the belief that something exists independently of the mind. If the writ of realism were to end here, all but the most dedicated idealists would subscribe to it160. Indeed, the position would be so general that it would probably be objected to on the grounds that it was vacuous and provides us with no ontological insight whatsoever. Of more interest, however, is the second realist claim, which specifies the nature of the entities that realists are committed to. It is this claim that distinguishes the different realisms from each other. For example, there are realists who hold that only observables exist (i.e. appearances or perceptions), others who hold that only physical entities exist (whether they can be observed or not) and still others who hold that unobservable and non-physical entities exist (such as natural laws or mental and spiritual entities).

The relationship between realism and truth is a complex one, with multiple divergent viewpoints. Ellis (1985) claims that realism is based on correspondence truth, stating that: "most realists see acceptance of a correspondence theory of truth as essential to their position". Vardy (1999) confuses the two by defining realism as the correspondence theory of truth, stating: "realism is the theory of truth that claims a statement is true if and only if it corresponds to the state of affairs that it attempts to describe". Devitt (1984), on the other hand, completely separates the two by claiming that realism has nothing to do with truth, stating: "what has truth to do with Realism? On the face of it, nothing at all. Indeed, Realism

158 Furthermore, for as long as there has been a form of realism, there has been a corresponding scepticism about its scope, mandate, applicability or veracity as a justified position in the first place. The current debate in the philosophy of science indicates that, in this regard, nothing has changed over the course of the past 3000 years. 159 Ontology (literally, the study of being) is a network of philosophical issues connected to the questions of: what kind of things can be said to really exist in contrast to what kind of things are only myths or illusions. 160 Indeed, many positions that are given the label 'idealism' actually accept the minimal position (for example Kant's Transcendental Idealism). Those that do not (the absolute idealists) are committed to the notion that (ontologically speaking) the real world is somehow mind-dependent. Of these, there is an ongoing dispute as to whether the mind at issue is transcendent to the natural world, immanent with the natural world, the collective social consciousness or simply some individual mind.

says nothing semantic at all ... does correspondence truth entail realism? It does not". Kirkham (1995) distinguishes between 'small r' realism (an ontological position) and 'big R' Realism (a theory of truth), thereby giving the same name to two different doctrines (associated with two different philosophic problems), and claiming, somewhat absurdly, that it is possible to be "a Realist (about truth) and some kind of antirealist (ontologically)"161.

As has been stated earlier, realism is first and foremost an ontological position, not a theory of truth162. Truth, on the other hand, is somewhat more elusive. The elusiveness arises because of the difficulties of separating out several highly interconnected questions163. Such questions include: what is the (metaphysical) essence of truth? What can be a truth bearer? What are the necessary and sufficient conditions for truth? How do we know if something is true? And, what is understood when we use the word 'true'? This network of interconnected questions has the potential to span several philosophic domains, including ontology, epistemology and linguistics.

It is clear that a person's ontological position will, undoubtedly, influence their attempt at answering the above questions (a realist typically requiring different characteristics of a theory of truth to an idealist). Thus, many theories of truth betray their realist origins (e.g. the various correspondence theories). However, what is not as obvious is that attempts at answering the above questions can also lead to a change in ontological persuasion. Indeed, this is exactly what has happened with one of the most respected realists of the late twentieth century, Hilary Putnam. Putnam's (1981) recanting of realism, which is described in his book Reason, Truth and History, was precipitated by a number of difficulties he encountered when attempting to elucidate the position he had defended for so long. Specifically, Putnam argued that it is impossible to justify any theory of truth that is based on realist assumptions and, therefore, realism must be abandoned. It is clear from Putnam's example that struggles with truth are just as likely to influence ontology as ontology is likely to influence truth. Thus, ontology and truth are recursively related.

Chalmers (2000) has written that: "the theory of truth most conducive to the needs of a realist is the so-called correspondence theory of truth". This is the position taken here. The ontological realist, who believes in a mind-independent 'real world', usually requires a certain ontological condition on the truth of a truth bearer. This condition is variously described as 'a certain (mind-independent) state of affairs obtaining' or that the truth bearer 'corresponds to certain (mind-independent) facts' or that the truth bearer 'correlates to certain (mind-independent) facts'. In this Section, it is argued that theories of truth that require a particular ontological condition to obtain should be collectively labelled ontological

161 Such a position would have a person who denies the existence of any 'real world' attempting to ascribe truth to statements based on their correspondence to this nonexistent real world. 162 Notwithstanding attempts to define realism as a theory of truth (Kirkham, 1995; Vardy, 1999). 163 Notwithstanding Devitt's (1984) remark that: "realism has nothing to do with truth".

theories of truth164. Such theories could theoretically be realist or idealist. However, to the author's knowledge there has never been a theory of truth proposed that explicitly requires an idealist ontology (truth being very much a preoccupation of realists). Thus, for the time being, ontological truth may be thought of as being synonymous with realist-ontological truth. As shall become evident later, if we structure our thinking along these lines it may be possible to avoid a number of the difficulties surrounding the current debate.

2.3.2.2 Russell's Correspondence Theory of Truth

"The truth is one, but error many."

- Parmenides

Bertrand Russell claimed that there is a "structural isomorphism" between truth bearers and the ontological facts to which they correspond (Russell, 1906; 1912). Thus, for a belief to be true its structure must 'mirror', 'map' or 'correspond' to the structure of the world. As Russell (1906; 1912) has expressed (with characteristic simplicity):

"When a belief is true, there is a complex unity, in which the relation which was one of the objects of the belief relates the other objects ... On the other hand, when a belief is false, there is no such complex unity composed only of the objects of the belief ... Thus a belief is true when it corresponds to a certain associated complex, and false, when it does not".

According to Russell (1906; 1912), a belief is a relationship between four different things:

1. The Subject (the person who has the belief).
2. The Active Object (the thing that the believer thinks is doing something to something else).
3. The Passive Object (the thing that the believer thinks is having something done to it).
4. The Object Relation (the relation that holds between the active and passive objects).

The belief that "(active object) x has (object relation) R to (passive object) y" is true, therefore, if and only if it is a mind-independent fact that "x does have relation R to y". Thus, truth

164 Ontological theories of truth are those that rely on an a priori commitment to a particular ontology. For example, Russell's and Austin's correspondence theories of truth (which both require a particular ontological state of affairs to obtain) are ontological theories of truth. As we shall see, according to this classification, there are several theories of truth that do not require an a priori ontology, but nevertheless may be consistent with a realist ontology. These are termed the anthropological theories of truth because they are not reliant on correspondence with a mind-independent 'real world' but on certain anthropologically created conditions obtaining. It is the 'consistency with' minimal realism, rather than 'reliance on' minimal realism, categorisation of anthropological theories of truth that is novel within this classification scheme.

involves a correspondence between two complex relations. The first is what Russell calls a belief (which is mind-dependent) and the second is what he refers to as a fact (which is mind-independent). Thus, a 'fact' is a complex relation composed of each component of the belief, minus the subject.

Russell (1912) offers an example from Shakespeare's Othello (see Figure 7); namely, Othello's belief that Desdemona loves Cassio. In this belief Othello is the subject, Desdemona is the active object, Cassio is the passive object and 'loves' is the object relation.

Figure 7: Russell's Correspondence Theory of Truth. (Diagram omitted: the subjective complex relation (the belief, comprising subject, active object, object relation and passive object) is true if and only if it corresponds to the objective complex relation (the fact).)
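Russell's four-term analysis lends itself to a simple computational reading. The following sketch is my own illustration, not anything proposed by Russell or in this thesis; all of the names and the toy 'fact set' are hypothetical. It models a belief as a four-term relation and checks its truth against a set of mind-independent 'facts', each represented as an (active object, relation, passive object) triple:

```python
from typing import NamedTuple

class Belief(NamedTuple):
    subject: str          # the believer; dropped when assessing truth
    active_object: str    # what the believer takes to be doing something
    object_relation: str  # the relation the believer takes to hold
    passive_object: str   # what the believer takes it to be done to

def is_true(belief: Belief, facts: set) -> bool:
    """Russell: 'x has R to y' is true iff the complex (x, R, y) obtains as a fact."""
    return (belief.active_object, belief.object_relation,
            belief.passive_object) in facts

# A toy world of facts, minus any believing subject.
facts = {("Desdemona", "loves", "Othello")}

othellos_belief = Belief("Othello", "Desdemona", "loves", "Cassio")
print(is_true(othellos_belief, facts))  # False: no corresponding complex obtains
```

Note how the subject drops out of the truth condition, mirroring Russell's claim that a fact is the belief-complex minus the subject. The sketch also exhibits the difficulty Horwich and Kirkham raise: a belief involving a non-existent object simply fails to match any fact and is classed as false, with no way to mark it as defective rather than merely mistaken.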

Both Horwich (1990) and Kirkham (1995) object to Russell's correspondence theory on the grounds that it only allows for beliefs to be false when the objects are not related in the way in which the subject thinks they are. Therefore, it cannot handle beliefs where one or more of the objects that the subject thinks are related does not exist. This objection highlights that Russell's correspondence theory is deeply rooted in a naïve form of realism. It takes for granted that the existence (or non-existence) of any object is a universally agreed upon fact. Moreover, it does not allow for ideas or abstractions to be terms in the complex relation165. Thus, Russell condemns truth to be only a property of statements about the 'real world' (whatever that is) and not a property of statements about relations of ideas.

165 Russell acknowledged this and attempted to solve the problem with his 'theory of descriptions', which states that sentences with non-referring terms are only abbreviations of sentences composed solely of referring terms. However, as Kirkham (1995) highlights, by adopting Russell's theory of descriptions, beliefs that are intuitively true turn out to be false based on the (a priori) falsity of the reference relation.

2.3.2.3 Austin's Correspondence Theory of Truth

"There is no reason whatsoever for the words used in making a true statement to mirror in any way, however indirect, any feature whatsoever of the situation or event."

- J. L. Austin

Austin's version of the correspondence theory rejects Russell's claim that truth bearers must mirror the world in a structurally isomorphic sense. According to Austin (1950), the truth bearer 'as a whole' corresponds to the state of affairs 'as a whole'. If the state of affairs obtains, then the truth bearer is 'true'; otherwise it is 'false'.

By rejecting Russell's structural congruence, Austin is forced to explain the nature of the correspondence he proposes. Thus, he arrives at his well-known theory of conventions. lf the correspondence does not consist in structural isomorphism, then it must be "absolutely and purely conventional" (Austin, 1970). As Pendlebury (1986) notes, Austin's theory of conventions resolves some of the difficulties associated with Russell's (1906; 1912) correspondence theory. For example, Russell's assumption that every predicate structurally correlates to a property requires that the meaning of the predicate be tightly defined. Austin's (1950) conception, on the other hand, does not require that conventions be defined in such a tight manner. For example, the proposition 'Elizabeth's bike has a flat tyre' does not necessarily have a tightly defined truth-value, the flatness of a tyre being a matter of degree.

Austin's version of the correspondence theory involves a four-term relation between:

1. Statements (the information conveyed by a sentence)166
2. Sentences (the medium through which a statement is made)
3. States of Affairs
4. Types of States of Affairs

A statement is true when the sentence used to make it, by convention, describes a type of state of affairs that is the type to which the particular state of affairs obtaining belongs. That is, it is true when it describes a type of state of affairs that obtains and false when it describes a type of state of affairs that does not obtain. Kirkham (1995) offers the following diagram by way of example:

166 According to Austin (1970), the meaning of a statement is a matter of two types of conventions. First, there are descriptive conventions, correlating sentences with types of states of affairs. Second, there are demonstrative conventions, correlating statements with historic states of affairs.

Figure 8: Austin's Correspondence Theory of Truth. (Diagram omitted: a statement is made by a sentence; the statement refers to a historic state of affairs; the sentence describes a type of state of affairs; the statement is true when the state of affairs is a member of that type.)
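Austin's four-term relation can likewise be given a toy rendering. In the sketch below (my own illustration; the example sentence, the weather scenario and all names are hypothetical), a type of state of affairs is modelled as a predicate over states, and a statement is true just when the particular state of affairs it refers to is a member of the type its sentence describes:

```python
# A particular (historic) state of affairs, picked out by demonstrative conventions.
state_of_affairs = {"place": "London", "time": "noon", "weather": "rain"}

# The type of state of affairs the sentence "It is raining" describes,
# fixed by descriptive conventions: all states in which the weather is rain.
def raining_type(state: dict) -> bool:
    return state.get("weather") == "rain"

def statement_is_true(referred_state: dict, described_type) -> bool:
    """Austin: true iff the referred state of affairs is a member of the described type."""
    return described_type(referred_state)

print(statement_is_true(state_of_affairs, raining_type))  # True
```

Because membership in a type need not be sharply bounded (one could make `raining_type` return a degree rather than a boolean), the model leaves room for Austin's looser conventions, as in the flat-tyre example above.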

Whilst Austin's version of the correspondence theory is more sophisticated than Russell's (in the sense that it denies structural isomorphism), it is still subject to many of the other objections to correspondence. These include objections to the notion of a mind-independent 'fact' or 'state of affairs' (the correspondence theory presupposes both ontological realism and epistemic access to the real world) (Strawson, 1950; 1964; 1966; 1985; Dummett, 1976; 1978; Davidson, 1983), and objections to the nature of the correspondence relation itself (Black, 1948; Tarski, 1969; Field, 1982). In summary, if truth holds of a sentence by virtue of its correspondence with 'facts', then we need an explanation of this 'correspondence' and of these 'facts'.

2.3.2.4 Tarski's Semantic Theory of Truth

"We regard the truth of a sentence as its correspondence with reality."

- Alfred Tarski

Alfred Tarski was a pure mathematician of considerable standing. However, among his non-mathematical 'hobbies', Tarski also made significant contributions to the fields of logic, philosophy and semantics.

Tarski's contributions to the theory of truth arose from his more general program of trying to provide respectability to the science of semantics. In order to confer scientific merit on semantics, Tarski supposed that he needed to show that the discipline did not assume the existence of any abstract entities in addition to those already assumed by the physical

sciences167. His general strategy was to define all semantic concepts (except satisfaction) in terms of truth, define truth in terms of satisfaction, and satisfaction in terms of physical concepts, as shown in Figure 9.

Figure 9: Tarski's Hierarchy of Conceptual Abstraction

From the outset Tarski constrained himself by adopting a reductionist ideology, which assumed that the only way to legitimate the study of semantics was to show that semantic objects were directly related to physical objects. In addition to these 'reductionist' constraints, Tarski also placed what seems to be an ontological constraint upon his theory of truth. This constraint Tarski refers to as the 'material adequacy condition', which states that any good theory of truth must satisfy: 'snow is white' is true if and only if snow is white. Tarski generalises this with his convention T, which states that: "p is true in L if and only if P", where P is any sentence of a language, L, to which the word 'true' refers (P is often referred to as a T-sentence) and p is a name of this sentence.

With these ideological constraints in place, Tarski begins his semantic program by making the seemingly innocuous observation that truth is a property of sentences (specifically T-sentences). Thus any general theory of truth must be cognisant of the structure of all possible sentences that have truth-value. Such sentences can be constructed by way of the operators 'not', 'and', 'or', and 'if ... then'. Thus, "a is not b", "if a then b", "if a then b and not c", etc are all possible sentences that have truth-value. Indeed, as Tarski discovered, there are an infinite number of such T-sentences.

167 As we have seen (Section 2.1.2), this represents a reductionist ideological position in regard to ontology (known as physicalism). Interestingly, whilst most contemporary commentators accept that Tarski has captured something fundamental in regard to truth, they deny that his physicalist program was successful. Thus Tarski is thought to have failed in his main goal, but succeeded in solving other (and, in Tarski's mind, minor) problems.

One of Tarski's most well known contributions was to make use of the notion of recursion and propose that truth be defined as the conjunction of all possible T-sentences. However, despite its initial success, this definition soon ran into insurmountable difficulties. As Tarski soon discovered, such a definition could never be comprehensive because T-sentences may be constructed from components neither of which is itself a T-sentence (and therefore neither of which can be characterised as true or false)168. Thus, his attempt to define the truth of a T-sentence in terms of the truth of its component T-sentences failed.

Not to be deterred, Tarski continued by developing perhaps his most significant creative contribution toward the theory of truth. Following the failure of his initial attempt, Tarski argued that since the property of truth is not possessed by all sentences, there must be some other property that is possessed by all sentences, which could subsequently be used to characterise truth. This property Tarski termed 'satisfaction'. Satisfaction is a relation between an object and an expression - thus it is a semantic concept169. According to Tarski, a sentence is true if and only if it is satisfied by all objects. For example, consider the open sentence 'X is white', where the object is snow. Snow satisfies 'X is white' if and only if snow is white. Furthermore, snow satisfies 'X is white and X is cold' if and only if snow is both white and cold. Alternatively, snow satisfies 'X is not white' if and only if it fails to satisfy 'X is white'.
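Tarski's recursive strategy can be sketched for a toy language. The code below is my illustration only; the two predicates and the world model are hypothetical, and Tarski's actual definitions are given for formal languages, not in a programming language. It defines satisfaction by recursion on the structure of open sentences built from predicates, 'not' and 'and':

```python
# Hypothetical world model: which predicates each object satisfies.
WORLD = {"snow": {"white", "cold"}, "coal": {"cold"}}

def satisfies(obj: str, sentence: tuple) -> bool:
    """Satisfaction defined by recursion on sentence structure."""
    op = sentence[0]
    if op == "pred":                 # ("pred", "white")  ~  'X is white'
        return sentence[1] in WORLD[obj]
    if op == "not":                  # ("not", s)         ~  'X is not s'
        return not satisfies(obj, sentence[1])
    if op == "and":                  # ("and", s1, s2)    ~  's1 and s2'
        return satisfies(obj, sentence[1]) and satisfies(obj, sentence[2])
    raise ValueError(f"unknown operator: {op}")

# Snow satisfies 'X is white and X is cold' iff snow is both white and cold.
print(satisfies("snow", ("and", ("pred", "white"), ("pred", "cold"))))  # True
# Coal satisfies 'X is not white' iff it fails to satisfy 'X is white'.
print(satisfies("coal", ("not", ("pred", "white"))))                    # True
```

The recursion bottoms out at the predicates, each of which needs its own satisfaction clause; this is why the construction only works for languages with finitely many predicates, a limitation taken up in the next paragraph.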

The definition of satisfaction is different for every predicate in the language. Therefore, Tarski's method only works for languages with a finite number of predicates, such as mathematical and logical languages (Putnam, 1978). Unfortunately, as Kirkham (1995) has pointed out, Tarski's conception of truth is not applicable to natural languages and hence may not be used to define the truth of beliefs expressed discursively. Tarski himself understood this and argued that it is impossible to define truth for any natural language170.

This general lack of applicability of Tarski's conception (even inside philosophy - a discipline not especially known for its practical credentials) raises a whole raft of objections. Haack (1978) objects to Tarski on the grounds that the semantic conception presupposes bivalence; that is, that every sentence is either true or false. Haack rejects bivalence, claiming that sentences may have an indeterminate truth-value, or no truth-value at all171. O'Connor (1975) argues that Tarski's conception is both circular and uninteresting, stating:

"How do we know that, for example, snow satisfies 'X is white' is true? ... is not an explanation of truth and falsity in terms of satisfaction plainly circular? To this objection, a defender of the

168 For example, an open sentence (an expression that is grammatically complete, but has one or more variables in place of nouns) may be the component of a sentence. r6e Thus, Tarski labelled his theory the semantic conception of truth. tto Whilst Tarski denied the possibility of applying his techniques to non-artificial languages, Davidson (19S4; 1990) has claimed that Tarski-like techniques can be used to provide a theory of truth for natural languages as well. Moreover, Davidson argues that a Tarski-like theory of truth is also a theory of meaning for natural languages. ltt See Pendlebury's (1986) objection to Russell's correspondence theory.

semantic theory of truth will reply that the theory is designed simply to give a clear and precise definition of truth. It does not pretend to offer a method for determining which particular sentences are true and which are false. The reply is justified but it does point to a feature of the theory that seriously limits its philosophical interest" (O'Connor, 1975).

Such objections lead to perhaps the most well known criticism of Tarski: that his theory is entirely vacuous, in the sense that it does not contradict any of the competing theories (Black, 1948; Pap, 1949; Quine, 1960; Sellars, 1963; Keuth, 1978; Putnam, 1981; Walker, 1989; Rorty, 1982). As Black (1948) states:

"Adherents to the correspondence, the coherence, or the pragmatist 'theories' of truth will all indifferently accept schema S [Convention T]."

The basis of the vacuity objection is that an a priori ontological commitment will affect the way in which a theoretician interprets such things as the "if and only if", or the ontological status of the 'fact' asserted on the right hand side of convention T. Thus, a realist may interpret the "if and only if" as 'if and only if, in the real world', whilst a non-realist may interpret it as something completely different, for example, 'if and only if it has the same degree of coherence'. To make matters worse, the ontological status of the 'fact' on the right hand side is left completely undiscussed by Tarski. Questions immediately arise about the mind-independence (or otherwise) of these facts, which the semantic conception of truth cannot answer. As such, the questioner is left to bring his/her own a priori ontological commitment to the semantic conception of truth. If convention T is ontologically neutral, an anti-realist may accept T-sentences and at the same time reject realism. Indeed, Dummett (1978) claims that "there seems no obstacle to admitting the correctness of schema T" and then construes the statement on the right hand side in an entirely non-realist manner172.

There has been much disagreement regarding how Tarski's theory influences the realism / anti-realism debate. Some have argued that Tarski's semantic conception of truth is not in any way reliant on realism (Haack, 1978; Mackie, 1973; Keuth, 1978). Others have argued that Tarski was indeed committed to realism, and that his semantic conception is therefore a type of correspondence theory (Popper, 1974b; Sellars, 1963; Kirkham, 1995). More interesting is Field's (1974) claim that Tarski believed his theory to be neutral but was wrong: it is not neutral, but intrinsically realist. Or consider Davidson's position(s): he once argued that Tarski's theory was a realist theory (Davidson, 1983), but has since

172 Dummett was actually referring to an anti-realist position in regard to the nature of mathematical entities, known as intuitionism. Intuitionism states that mathematical entities exist when and only when they are proved to exist. Or, in the language used thus far: "mathematical states of affairs obtain, iff they are proved to obtain". Here the "iff" is interpreted as "because". Such a position is consistent with Wittgenstein's well known remark that: "The mathematician is an inventor, not a discoverer", and with the work in mathematics by De Millo et al (1979).

recanted, claiming that it is not realist at all (Davidson, 1990). Indeed, Tarski himself has added to the confusion by variously describing his theory as realist:

"We regard the truth of a sentence as its correspondence with reality." (Tarski, 1936), and neutral:

"We may remain naive realists, critical realists or idealists, empiricists or metaphysicians - whatever we were before. The semantic conception is completely neutral toward all these issues." (Tarski, 1944).

Tarski's apparent contradictions notwithstanding, it is clear that convention T implies that the truth of an expression depends on two (and only two) conditions:

1. What the expression means.
2. How the world is.

The first condition implies linguistic relativity; that is, truth relative to the language employed (true in L1, true in L2, etc). Indeed, Blackburn (1984) and Putnam (1985) both claim that Tarski is advocating relativity and that, rather than defining absolute truth, Tarski actually defines something distinct from truth, called L-truth (truth in language 1, truth in language 2 ... etc). According to Blackburn (1984) and Putnam (1985), Tarski does not reveal anything about the intrinsic (or trans-linguistic) nature of truth.

The second condition of convention T implies ontological realism. Certainly, it seems reasonable to conclude that this is what Tarski believed of his theory, when he stated "We regard the truth of a sentence as its correspondence with reality" (Tarski, 1936). Whilst it is possible to construe convention T in an anti-realist manner (as Dummett (1976; 1978) has done), this is not what Tarski intended and therefore the resulting theory will be a Tarski-like theory, but not Tarski's.

Given the above, what is to be made of Tarski's claim of neutrality? Further reading reveals that Tarski made his oft quoted claim of neutrality within the context of epistemology, not ontology. Thus, Tarski actually claims that his theory is ontologically realist, but epistemologically neutral. As we shall see later, the failure to distinguish an ontological doctrine from an epistemic one has been the source of much confusion in the contemporary debate between realism and anti-realism.

In Tarskian fashion, this thesis similarly argues for neutrality (Section 2.3.5). However, whereas Tarski argued for epistemic neutrality, this thesis argues for ontological neutrality.

That is, mind-independent, ontological 'states of affairs' can never be known, as we have no way of escaping the mind. Such a position leads to epistemic humility: the doctrine that it is impossible to have epistemic access to 'real (or ontological) truth'. But we get ahead of ourselves once again; before we discuss epistemic humility it is important to first review the anthropological theories of truth.

2.3.3 The Anthropological Theories of Truth

"What, then, is truth? A mobile army of metaphors, metonyms, and anthropomorphisms - in short a sum of human relations, which have been enhanced, transposed, and embellished poetically and rhetorically, and which after long use seem firm, canonical and obligatory."

- Friedrich Nietzsche

2.3.3.1 Scepticism about Ontology

"Between the idea
And the reality
Between the motion
And the act
Falls the shadow."

- T. S. Eliot

As has already been stated, there is no one doctrine of realism. Rather, there is a family of related positions. Yet for each realist position it is possible to find a corresponding scepticism in regard to its scope, mandate, applicability or veracity as a justified position in the first place. Scepticism in regard to realism can, therefore, take many forms. Some sceptics argue that there is no 'real world' (absolute idealism); others that there is a 'real world' but we can have no mind-independent, epistemic access to it (transcendental idealism); and still others that we cannot have mind-independent access to the 'real world', but that this lack of access can be overcome by following some 'objectifying' method (various bounded positions, such as positivism and phenomenalism) (Greco, 2000; Hookway, 1990; Rescher, 1980; Roth & Ross, 1990; Stroud, 1984).

Scepticism has had a long, tumultuous, and often difficult history in western thought, as both the Christian Middle Ages and post-Enlightenment Modernity have not taken kindly to its challenges. However, despite several attempts to dismiss scepticism, it has continued to

flourish, and when dogmatism has been at its greatest, the sceptical backlash has been equally severe.

Perhaps the first known sceptical writings are those of the pyrrhonists, who argued that no position was more worthy of acceptance than any other. Sextus Empiricus (160A.D. - 210A.D.), in his Outlines of Pyrrhonism, defines pyrrhonism as:

"An ability, which opposes appearances to judgements in any way whatsoever, with the result that owing to the equipollence of the objects and the reasons thus opposed, we are brought firstly to a state of mental suspense and next to a state of 'unperturbedness' or quietude".

Suspending judgement, for the pyrrhonists, meant living without dogma. Rather than denying appearances (as at a first reading it often appears, and as some commentators have argued), the pyrrhonists denied dogmatism in regard to any account of an appearance. The writ of their scepticism went no further. Specifically, it did not encompass the belief that there was no underlying 'real object' at the source of the appearance. As Sextus (160A.D. - 210A.D.) states:

'When we question whether the underlying object is such as it appears we grant the fact that it appears, and our doubt does not concern it but the account given of that appearance."

Thus, the pyrrhonists accepted the minimal realist position, that the external world exists independent of the mind173.

Following the pyrrhonists, Carneades (214BC - 129BC) and Cicero (106BC - 43BC) founded the academic school, which adopted the pyrrhonist epistemic position whilst rejecting their philosophy of inquiry. In particular, the academics argued that pyrrhonist scepticism should lead not to tranquillity (as the pyrrhonists had argued), but to perplexity. Opposing accounts of appearances, according to the academics, should not lead to a state of mental suspense (often associated with inaction) but to a "sensible uneasiness", which should, in turn, stimulate inquiry. This uneasiness is what inspired David Hume, the most famous of the modern sceptics, to undertake his Treatise of Human Nature.

David Hume (1711-1776) is variously described as a naturalist, a sceptic and an empiricist. In fact, he himself uses all three to characterise his work. His position is sceptical in so far as he argues that knowledge has nothing like the reliable foundations that the Cartesians claimed174.

173 The absolute idealist, on the other hand, argues that the 'external world' does not exist. According to the absolute idealist, the perceptions we have of the external world are just as likely to be caused by a dream, a spirit, or the artificial stimulation of our brains in a vat as they are by the actual existence of the world we think we inhabit. 174 Indeed, Cartesian scepticism is a sham! A matter of feigned doubt that eventually leads to the (supposed) reinstatement of everything that was doubted.

Hume begins his sceptical attack on Cartesian foundationalism by distinguishing between perception and thought. When we perceive something, we are aware of something immediately present to the mind through the senses. According to Hume, perception is unproblematic. We have no reason to be sceptical about the existence of (or the nature of) the objects of sense experience175. However, Hume is sceptical about thought; specifically, thought that entails reasoning about past perceptions.

According to Hume, everything that we can think or reason about is either a relation of ideas or a matter of fact. Relations of ideas are considered a priori and are true or false irrespective of observation; that is, they are necessary (Hume offers geometry and arithmetic as examples). Matters of fact, on the other hand, are not a priori; rather, they are a posteriori and, as such, Hume refers to them as contingent. It is these contingent matters of fact that Hume is sceptical about. According to Hume, all reasoning from past perceptions to matters of fact is based on the idea of cause and effect:

"All reasoning concerning matters of fact seems to be based on the relation of cause and effect ... Causation is the only relation that can be trac'd beyond our senses and informs us of existences and objects, which we do not see or feel." (Hume, 1741).

The belief that one type of thing is the cause of another cannot be discovered by reason alone. Rather, it is postulated after repeated observation or, in Hume's words, "repeated perceptions of one type of thing being followed by another type of thing". From this we infer that one is the cause of the other. However, as Hume (1741) points out, even reason in combination with past perceptions cannot provide any assurance of cause and effect: "how does the experience of events being consistently conjoined in the past license an inference to the claim that they will continue to be so conjoined in the future?" What is needed is an additional assumption that the future will always resemble the past or, in Hume's words, that "the course of nature continues always uniformly the same". All inferences about matters of fact are contingent upon that principle. This leads Hume to ask on what basis we can justify this assumption. His response to this question reflects the scope of his scepticism. According to Hume:

"There can be no demonstrative argument to prove it, for it is at least conceivable that the course of nature might change: what is conceivable is possible; what is possible cannot be demonstrated to be false; therefore it cannot be demonstrated that the course of nature will not change." (Hume, 1741).

175 Accordingly, Hume sidesteps absolute idealism, together with the theory-ladenness of observation.

Thus, we have no rational basis for believing the principle of induction; it must be accepted on faith and, therefore, all beliefs about contingent matters of fact depend on an assumption that can never be known. Hume's argument, therefore, runs along the following lines:

1. All beliefs about contingent matters of fact depend for their justification on past observations together with the assumption that the future will always resemble the past.

2. But this assumption is an a priori metaphysical belief that can never be justified.

3. Therefore, all of our knowledge of contingent matters of fact rests on an assumption that can never be known.

Hume is often described as a sceptic. Indeed, it is the self-description of his Treatise. However, it is important to highlight the scope and limits of his scepticism. Hume is not sceptical about reality - he is not an absolute idealist. Nor is he sceptical about perceptions - there is no discussion about the theory-ladenness of observation176. So what, if anything, is Hume sceptical about? As has been highlighted earlier, Hume is sceptical about the idea that our knowledge of contingent 'matters of fact' has firm foundations. He, therefore, rejects the Cartesian notion that knowledge is a result of sound reasoning from indubitable foundations. The following quote from his Abstract is indicative:

"The belief, which attends experience, is explained to be nothing but a peculiar sentiment, or lively conception produced by habit. Nor is this all; when we believe any thing of external existence, or suppose an object to exist a moment after it is no longer perceived, this belief is nothing but a sentiment of the same kind." (Hume, 1741).

Beliefs about the functioning of the external world of the senses (i.e. contingent knowledge of matters of fact) are not a result of reasoning from certain foundations but are the result of us making unknowable metaphysical assumptions about the world. Hume's scepticism, therefore, amounts to a critique of our reliance on metaphysical assumptions in constructing knowledge of matters of fact!

According to Hume, if left to themselves, our rational facilities would undermine any belief about matters of fact (Fogelin, 1993). lndeed, Hume claims that:

176 In the Treatise of Human Nature there is a section entitled Scepticism With Regard To The Senses, which at first glance appears to be a discussion on the theory-ladenness of observation. However, a thorough reading reveals that the focus of Hume's scepticism is the inductive reasoning from observations, which leads us to believe that "the objects of our awareness can enjoy a continued and distinct existence". This is not to say that Hume is sceptical about the external world. Rather, he is sceptical that reason can provide a basis for our belief in the external world.

"Sceptical doubt arises naturally from a profound and intense reflection on those subjects; it always increases, the farther we carry our reflections."

We are saved from utter scepticism, according to Hume, because our non-rational reliance on unprovable metaphysical assumptions overwhelms the doubts that reason attempts to force on us (Fogelin, 1993). This leads Hume to recommend a mitigated scepticism as a middle way between pyrrhonism and naïve realism, being a product of rational doubt and instinctual belief.

Of the various responses to Hume's scepticism, perhaps the most constructive (and certainly the most famous) was that of Immanuel Kant. As we have seen (Section 1.2.4), Kant took Hume's scepticism seriously, stating that it was "a recollection of Hume [that] interrupted his dogmatic slumber". Following this, Kant set out to "institute a tribunal which would assure to reason its lawful claims, and dismiss all groundless pretensions".

Kant's inspired contribution was to adopt the lessons learned from Copernican astronomy and apply them to the problem of knowledge in general. The result, he argued, would be a "Copernican revolution in philosophy". Just as Copernicus explained the 'apparent' motion of the Sun in terms of the movements of the observer on earth, Kant explained our 'apparently' independent knowledge of objects in terms of our mode of cognition. Thus, what appear to be wholly objective phenomena are actually partly subject-constructed. According to Kant, the subject is implicated in all accounts of experience by way of the existence of a priori intuitions necessary for the very experience of objects (thus his scepticism goes further than Hume's and pertains to the very experience of objects themselves). Following Kant, the Cartesian dualism, of an internal knowing subject and an external object of knowledge, that had dominated modern science and philosophy, died! Accordingly, subsequent thinkers (the post- or neo-Kantians) began to explore the ways in which the inquirer constructs the world of their own knowledge.177

The influence that David Hume and Immanuel Kant have had on science and the philosophy of science cannot be overstated. The problems they raised in regard to the naïve realism that had dominated inquiry during the Enlightenment were to become the main problematics

177 Kant's idealist strategy immediately raises the question of how much of the object the subject is responsible for. His answer to this is that objects are subject-dependent only with respect to those of their features that conform to our experience; that is, only with respect to whatever makes them objects to us in the first place. The writ of his idealism extends no further and, specifically, does not extend to the existence of objects. Or, in Kant's words: "representation in itself does not produce its objects in so far as existence is concerned". Kant's idealism, therefore, concerns not the existence of objects but the properties that we predicate of them by virtue of how we know them. To say that the subject constitutes the object (as far as existence is concerned) is equivalent to saying that objects are 'created' by our representations of them, a position that Kant explicitly rejects. Thus he calls his position transcendental idealism: indicating that it is not so much concerned with objects, but with our mode of cognition of objects. All of the above implies that the 'two worlds' interpretation of Kant is overly simplistic. A noumenon is not a different object to its associated phenomenon. Rather, the phenomenon is the appearance that arises from the objective existence of the noumenon. Or, employing the 'cookie cutter' metaphor: the dough (thing in itself) is independent of the cook (us). Hence it exists objectively. However, the cook imposes cookie cutters (a priori

facing philosophy of science for the next 200 years. Over this time, the literature has yielded at least four broad responses to their scepticism. These are discussed in what follows as:

1. Antagonism
2. Positivism
3. Critical Rationalism
4. Postmodernism170

The first of these, antagonism, is a position that chooses not to engage sceptical arguments seriously. The antagonist claims that either sceptical arguments are self-refuting or they have no practical significance and, as such, are philosophically uninteresting.

The first of these claims, that sceptical arguments are self-refuting, is easily countered. According to the antagonist, in claiming that no one knows, the sceptics are themselves making a knowledge claim and thereby contradicting themselves. How, asks the antagonist, do they know that no one knows? However, such a response misses the point that a number of historically prominent sceptical arguments make no obvious mistake. On the contrary, many sceptical arguments begin with the very same assumptions about knowledge and its legitimation held by the antagonist. They merely show how these assumptions inevitably lead to untenable positions. The sceptics' weapon, therefore, is cogent reasoning. S/he does not claim to 'know' in any 'transcendental' sense of the word, but claims to 'show' the logical consequences of common assumptions about the nature of knowledge. In this sense, the sceptic makes no self-refuting argument.

The antagonist, on the other hand, rarely attempts to engage the sceptic to such a degree that s/he is able to put forward any sort of defence to the charge of circularity. Rather, the antagonist prefers to reject the sceptic's conclusions outright due to their sheer unbelievability. So committed to the traditional understanding of knowledge and its legitimation are they, that many antagonists have argued that surrender to scepticism is either psychologically impossible or that epistemology must begin with the a priori assumption that scepticism is (somehow) wrong and therefore we should move on to more interesting problems. Such an attitude, however, amounts to little more than mindless dogmatism.

It is the contention of this thesis, however, that engagement with sceptical arguments is not only interesting but increasingly important. This is not because we should all adopt a Pyrrhonist position of knowing nothing, but because sceptical arguments drive progress (Greco, 2000). They do this by highlighting plausible but mistaken assumptions about the

concepts) on the dough in order to create cookies (phenomena). Thus Kant turns out to accept the minimal form of realism in the same way that others who were supposed to be 'anti-realist', such as Hume and the Pyrrhonists, do.
170 Including pragmatism and post-structuralism.

modern understanding of the nature of knowledge and its legitimation. As a result, they drive us to re-think many of the pre-suppositions that have dominated inquiry since the Enlightenment. As Greco (2000) suggests:

"Sceptical arguments are important not because they might show that we do not have knowledge, but because they drive us to a better understanding of the knowledge we do have."

That is, sceptical arguments lead us to a position where we can begin to think past (or post) the modern understanding of knowledge and its legitimation. This has been the program of positivists, critical rationalists and postmodernists alike. In the previous Chapter (2.2), it was argued that the positivism of the Vienna Circle and the critical rationalism of Karl Popper both attempted to respond to scepticism (of proof by induction) by reformulating our understanding of scientific method. However, both of these reformulations attempted to maintain a traditional understanding of scientific knowledge and, as we saw, succumbed to further sceptical attack. Accordingly, this thesis advocates a different approach - one that seeks to reformulate our traditional understanding of scientific knowledge (as a list of truths). This response is what this thesis terms the postmodern response to scepticism. However, before we begin to look at the postmodern response, it may be useful to briefly re-visit positivism and critical rationalism.

As we have already discussed, the positivists accepted Hume's scepticism in regard to causality and therefore found no place for metaphysical (or structural) conjecture within science. This led the positivists to adopt a radical empiricism and reformulate the rationalist's narrative of legitimation of science by grounding it in perception (recall Hume's scepticism did not extend to perception). According to the positivists, we could only ever have knowledge of sense-data. Adopting the Humean vocabulary in its entirety, they termed any inference about the unobservable structure of reality 'metaphysics' and derided it as 'cognitively meaningless'. The difficulty with this approach, as we have seen, is that it ignores sceptical arguments about the veracity of perception itself (such as Kant's transcendental aesthetic or the theory-ladenness of observation). Understanding this, the positivists attempted to overcome it by suggesting that objectivity could be 'conferred' onto the theory-laden objects of empirical inquiry by adopting a rigorous mathematical approach. As we have seen, this attempt failed and so too did the entire positivist program. In particular, the positivists failed to develop an account of science that was at once capable of maintaining a modern understanding of the nature of scientific knowledge (as encompassing truth) and of responding to sceptical arguments in regard to perception and thought.

One response to the failures of positivism, as we have seen, is that of the critical rationalists. Popper, perhaps the most influential of the critical rationalists, also accepted Hume's

scepticism in regard to causality and, accordingly, suggested that inductive inferences should have no place in science. This led Popper to adopt a radical fallibilism and reformulate the positivist narrative of legitimation of science by grounding it in progress. According to Popper, we can only ever have knowledge of deducible consequences. As such, we can never inductively verify scientific theories. However, by falsifying them and replacing them with ones that do not yield to falsification so easily, we can assume that science is progressing closer and closer by approximation (or verisimilitude) towards the absolute truth of the matter. As we have seen, the difficulty with this approach is that it once again gives way to theory-ladenness. If a theory clashes with observation, it is not necessarily the case that the theory is wrong. Moreover, sophisticated theories never confront Popper's 'tribunal of observation' in isolation. As such, they are always able to survive apparent falsification by deflecting it onto one or more auxiliary hypotheses. The critical rationalists, therefore, similarly failed to develop an account of science that was at once capable of maintaining a modern understanding of the nature of scientific knowledge (as progressing towards truth) and of responding to sceptical arguments in regard to perception and thought.

Whereas the positivists and critical rationalists clung to the great stories of legitimation of modernity, a growing number of thinkers have begun to despair of ever developing an account of science that is at once capable of maintaining a commitment to these narratives and of responding to sceptical arguments at the same time. According to these thinkers, sceptical arguments are a pointer to fundamental difficulties associated with the traditional narratives themselves and, therefore, the attempt by positivists and critical rationalists to deflect this onto the methodological account of science misses the point. What is required is to completely reformulate our traditional understanding of knowledge and its legitimation. These scholars are sceptical about any attempt to confer objectivity upon knowledge. Kant's Copernican shift is used to argue for the theory-ladenness of observation and subsequently for scepticism about sense data as well as causality and structural conjecture. Included within this group are the sociologists of knowledge (Bloor et al.), the so-called radical epistemologists (Kuhn, Feyerabend and Goodman), contemporary empiricists (van Fraassen), contemporary pragmatists (Rorty and the later Putnam) and post-structuralists (Foucault, Derrida, Lyotard etc.). Whilst they come from diverse intellectual backgrounds, they may be collectively labelled as postmodern, in the sense already discussed. That is, they are sceptical about meta-narratives, in this case, the particular meta-narratives that modern science has traditionally used to legitimate its claim to objective knowledge and truth.

In response to scepticism towards the narrative that science encompasses (or approaches a closer approximation to) correspondence truth, some have attempted to re-define truth so as to make it more attainable and less grandiose. Such re-definitions no longer rely on realism, or mind-independent access to the 'real world', but begin to associate truth with certain

anthropologically created conditions obtaining. These conditions are unashamedly defined in either subjective (usefulness, degree of acceptability, etc.) or relative (degree of coherence with other beliefs) terms. Truth, according to these definitions, becomes radically anthropocentric. Accordingly, this thesis labels them, collectively, the anthropological theories of truth. They are not specifically anti-realist, as the literature often supposes. Indeed, they rarely say anything about ontology at all. If pressed, proponents of these theories would probably claim that Kant has dissolved the traditional ontological question: fully real things are not objects that we can intelligibly seek knowledge of. Therefore, ontology is little more than "grasping for the wind".

It is to these anthropological theories of truth that we now turn.

2.3.3.2 Pragmaticism

"The opinion, which is fated to be ultimately agreed to by all who investigate, is what we mean by truth."

- Charles Sanders Peirce

It is well known that Charles Peirce was the founder of the philosophic school of pragmatism. However, what is less known is that Peirce himself came to reject the label in favour of pragmaticism, when the former term was adopted by James, Dewey and others to label a different theory altogether. Peirce explains the situation as follows:

"It has probably never happened that a philosopher has attempted to give a general name to his own doctrine without that name's soon acquiring, in common philosophical usage, a signification much broader than was originally intended ... it is time to kiss the child goodbye and to announce the birth of a new word 'pragmaticism', which is ugly enough to be safe from kidnappers." (Peirce, 1932).

Peirce's theory of truth begins with his observation that different people using different methods to explore the same question, more often than not, will arrive at the same conclusion. lndeed, Peirce argues that, given enough time and information, all minds will tend to agree.

"Let any human have enough information and the result will be that he will arrive at the same conclusion that any other mind will reach ... human opinion universally tends, in the long run, towards truth."

By itself, this statement may seem a reiteration of the modern narrative of legitimation - the story of science's gradual progression towards transcendental truth. However, Peirce argues that it is not just that different minds tend to agree (which could imply that all minds reach the same transcendentally false conclusion) but that truth is consensus - by definition, the consensual conclusion is true. Accordingly, Peirce defines a proposition as true if and only if it is agreed to by everyone who investigates the matter.

"The opinion which is fated to be ultimately agreed to by all who investigate is what we mean by truth." (Peirce, 1932).

It does not matter how this consensus is achieved, whether by the scientific method, mass hypnosis or the teachings of a guru; it is the consensus that is the truth-determinant and not the method of arriving at it.179 Thus Peirce states: "If a general belief can in any way be produced, though it be by the faggot and the rack, to talk of error in such belief is utterly absurd."180

The question that immediately comes to mind is: what is the source of Peirce's optimism? Why is Peirce so confident that all investigators of a particular problem will come to agreement? At first glance it may seem that behind Peirce's pragmatic definition of truth lies a realist ontology. That is, the only propositions that everyone would agree on are those that accurately reflect a single objective (mind-independent) reality. However, Peirce (1932) argues that it is a neo-Kantian idealism, rather than a realist ontology, that drives his theory:

"My social theory of reality, namely, that the real is the ideal in which the community ultimately settles down ... this theory involves phenomenalism, but it is the phenomenalism of Kant, and not that of Hume ... It was the essence of [Kant's] philosophy to regard the real object as determined by the mind ... it was to regard the reality as the normal product of mental action."

According to Peirce, it is the 'collective mind' that constructs what is taken to be reality (by way of agreement) and not 'reality' that constructs the agreement.

The most significant difficulty facing Peirce's pragmaticism may now seem obvious. Peirce's optimism regarding the ubiquity of truth does not follow from his social idealism. If there is no underlying 'real world' forcing itself upon us in a 'mind-independent' manner, then how is it that "human opinion universally tends, in the long run, towards truth [consensus]"? In response to this inconsistency, Kirkham (1995) argues that "Peirce's theory of truth is plausible only

179 It is interesting to note that the reason why Peirce thought science such a valuable enterprise was not because its results corresponded with some objective reality, but because it was such a powerful method for achieving consensus.
180 This is similar to the position Strawson (1964) takes in his 'performative theory'. According to Strawson, truth ascriptions are performative utterances, similar to 'I agree'. Therefore, to utter the sentence 'it is true that snow is white' is not to say anything at all. Rather, it is to agree with the proposition that 'snow is white', signalling the performance of the act of agreeing.

because it is parasitic on another, hidden theory of truth - the correspondence theory". For example, consider a community that has had access to sufficient information and time to achieve consensus (i.e. 'created' truth). Peirce would have it that the observations which caused this group to reach the final conclusion were forced on them in reverse chronological order by the final consensus, which at the time had not yet been achieved (Kirkham, 1995).

Kirkham's argument is a compelling one; however, he is incorrect to suppose that he has shown an inconsistency between Peirce's theory of truth and his social idealism. What he has shown is that Peirce's optimism does not necessarily follow from either - there is no reason why Peirce's social idealist ontology and pragmatic theory of truth cannot coexist with a thoroughgoing scepticism about the attainment of truth. Indeed, it seems reasonable to assume that the presence of optimism is related to the belief in either a Kantian-style transcendental unity of apperception or a single unified worldview governing society (thereby causing people to construct essentially the same 'picture' of 'reality'). In the relative monoculture that characterised late 19th century, post-civil-war Harvard, Peirce may justifiably claim a single worldview, and therefore regular consensus. However, in the multicultures of the early 21st century postmodern West, it is less likely that consensus will be arrived at by all who investigate. If consensus equals truth, then today pragmatic truth would tend to be the exception rather than the norm.

2.3.3.3 Instrumentalism

"'The True', to put it briefly, is only the expedient in the way of our thinking, just as the 'right' is only the expedient in the way of our behaving."

- William James

Instrumentalism is the name William James (1909) gave to his theory of truth. Following Kant, James argued that our minds organise and structure all experience by way of basic categories, concepts and intuitions. However, unlike Kant, James did not believe that these concepts are built into the mind181. Rather, following Hegel, James argued that the organising concepts the mind uses to structure experience have been passed down to us by the way in which our ancestors structured their own experience. Thus concepts such as time, space, class, the distinction between matter and mind and the distinction between object and attribute are not permanent structures of the mind but inherited ones. This belief is the source of James's understanding of truth. If the mind structures all experience, and the categories by which the mind does so are not transcendentally universal, then it is possible for different communities to have been taught different categories. Moreover, there is no way any one

181 That is, he rejects the Kantian transcendental unity of apperception.

community can shed their mental schemata in order to judge whether their ideas of reality are 'truer' than those of another community. In the absence of any hope of assessing closeness to mind-independent reality, James (1907) argues that truth should therefore be thought of as those beliefs that are useful to those who believe them:

"Any idea that will carry us prosperously from one part of our experience to any other part, linking things satisfactorily, working securely, simplifying, saving labor, is true instrumentally ... ideas become true just insofar as they help us to get into satisfactory relations with other parts of our experience."

Truth, therefore, happens to an idea by virtue of it seeming to work in well with the rest of our web of beliefs. It is, in effect, the compliment we give any idea that fits with the whole inchoate set of assumptions constituting our worldview.

The subjective nature of James' definition of truth may lead to the scenario where one observer believes B1 to be useful (true instrumentally) whilst, at the same time, another observer believes not-B1 to be useful (true instrumentally). Thus, truth is relative to the person assessing usefulness (that is, truth is subjective). Indeed, James (1909) acknowledges this, stating:

'Truth may vary with the standpoint of the man who holds it."

James' focus on 'usefulness' as the criterion of truth leads to the most common objection to instrumentalism. That is, it does not acknowledge the existence of theoretical truths (truths with no practical value). However, a credible answer to this objection has already been supplied by James (1909a) in his The Meaning of Truth, where he states:

"With the past, tho we suppose ourselves to know it truly, we have no practical relations at all. It is obvious that, altho interests strictly practical have been the original starting-point of our search for true phenomenal descriptions, yet an intrinsic interest in the bare describing function has grown up ... A true idea now means not only one that prepares us for actual perception. It means also one that might prepare us for merely possible perception, or one that, if spoken, would suggest possible perceptions to others, or suggest actual perceptions which the speaker cannot share".

Thus James gets his retaliation in first by claiming that an intrinsic interest in theoretical descriptions has developed which "seems to be the characteristic human differentia". That is,

theoretical descriptions have moved from being 'means to an end', to 'one of the ends' themselves!

2.3.3.4 The Coherence Theory of Truth

"It is not the real world; it's a world we made up."

- Frank Oppenheimer

The coherence theory of truth views coherence and consistency within a particular framework as sufficient for 'truth' (that is, truth relative to that framework). Thus, for a proposition to be true, it must cohere with a system of beliefs (or axioms). It is not just that it is true if and only if it coheres with that system; rather, it is that coherence, and nothing else, is what truth consists of (Walker, 1989). It thus represents an explicit rejection of the correspondence theory of truth.

A set of two or more beliefs is said to cohere if and only if:

1. Each member of the set is consistent with any subset of others.

2. Each is implied by all the others when used as premises.
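For a finite set of propositional beliefs, these two conditions can be checked mechanically. The following sketch is a hypothetical illustration, not part of the thesis: the function name `coheres` and the example propositions are my own, and condition 1 is checked in the simplified form of joint satisfiability of the whole set (which, for finite sets, entails that each member is consistent with every subset of the others).

```python
from itertools import product

def coheres(beliefs, variables):
    """Test the two coherence conditions for a finite set of propositional
    beliefs, each given as a function from a truth assignment
    (dict of variable name -> bool) to True/False.

    Condition 1 (consistency): checked here as joint satisfiability of the
    whole set, which entails subset-wise consistency for finite sets.
    Condition 2 (mutual implication): each belief must hold in every
    assignment that satisfies all of the others.
    """
    assignments = [dict(zip(variables, values))
                   for values in product([True, False], repeat=len(variables))]

    # Condition 1: some assignment makes every belief true at once.
    if not any(all(b(a) for b in beliefs) for a in assignments):
        return False

    # Condition 2: the conjunction of the others implies each belief.
    for i, belief in enumerate(beliefs):
        others = beliefs[:i] + beliefs[i + 1:]
        if any(all(o(a) for o in others) and not belief(a) for a in assignments):
            return False
    return True

# Hypothetical examples: {p, q, p-and-q} coheres; {p, not-p} fails condition 1.
p = lambda a: a['p']
q = lambda a: a['q']
p_and_q = lambda a: a['p'] and a['q']
not_p = lambda a: not a['p']
```

Note that a set such as {p, p implies q, q} would fail condition 2 on this reading, since q alone does not imply p; the mutual-implication requirement is considerably stronger than mere consistency.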

The motivation for the coherence theory of truth is a general scepticism in regard to our ability to make absolute statements in regard to matters of fact. To the coherence theorist we are only entitled to claim that something is 'true' if we know it with certainty, and we do not know this of any matters of fact. All of our statements in regard to matters of fact are relative; either to our mental schemata (i.e. our perceptions are theory-laden or transcendentally ideal) or to some set of foundational and unprovable assumptions (i.e. our reasoning in regard to matters of fact relies on metaphysical assumptions). As Plato has said, "truth is severe, hard, demanding", and anything that passes a test for it must be proven to be true - otherwise what use is the concept of truth (as distinct from other concepts such as 'justification' or 'warranted assertibility')? As we have seen, the correspondence account of truth can never provide such proof.

To the coherence theorist, if the mind is implicated in all of our statements about the world, then we can never claim that any such statement is true. Thus, for certainty to be assured, truth must be entirely rational (analogous to strict logical deduction or mathematical proof from a set of axioms). It is important to realise what the coherence theorist is certain about. S/he is certain that truth is the property a proposition is proven to have when it passes the test for coherence-truth (outlined above). However, the only demonstrable property that such a

proposition has is that it coheres with other propositions. As such, the coherence theory is silent on the truth status of its premises and supplies no means whatsoever for adjudicating between competing frameworks, beliefs or axioms (or anything else that acts as a premise). Thus, coherence and consistency within a particular framework is sufficient for 'truth'; that is, truth relative to the framework (see Figure 10). The coherence theory, therefore, represents an explicit rejection of the belief that objective access to an 'absolute' or 'real' world is possible. By so doing, it circumvents the problems associated with sceptical arguments.

Figure 10: A Representation of a Web of Beliefs182

Coherence truth is an entirely rational position (or, in Kant's language, necessary, a priori and analytic). A proposition is only ever true if it can be rationally deduced from other propositions. As we have seen, this makes coherence truth a relative position (relative to those propositions which act as the premises). This has been seen as a serious defect by those who still want to say something 'truthful' about the empirical world (in Hume's language, 'matters of fact' or, in Kant's language, 'a posteriori judgements'). However, if the coherence theorist is to remain true to his/her principle of certainty then s/he must resist the temptation to try to answer the realist's demand for 'truth about the external world'. Specifically, a coherence theorist can never answer the questions that the realist would consider the most important - such as how to judge between competing sets of beliefs. Unfortunately, such questions have proved far too tempting for most coherence theorists. Indeed, even the most famous coherence advocate, Brand Blanshard (1941), has attempted to answer these questions from a 'coherence' position in his book The Nature of Thought.

When confronted with the difficulty of choosing between two conflicting, yet coherent sets of beliefs, Blanshard states that the set of beliefs that gives a complete picture of the world should be retained and the incomplete set abandoned. Unfortunately, this raises more

182 Wittgenstein (1953) perhaps best captures the essence of the coherence theory by stating that: "statements are true insofar as they cohere with the entire system of beliefs". B1 is "true", with respect to {B2 ∪ ... ∪ Bn}, because it coheres, and B6 is "not true", with respect to the same set. However, B6 may be "true", with respect to some other set, say {B7 ∪ ... ∪ Bm}, at the same time as being "not true", with respect to {B1 ∪ ... ∪ Bn}.

questions than it answers. How do we ascertain what is a complete picture? What if there are multiple complete and coherent pictures? Blanshard's (1941) response to the latter question is that they are not really two separate belief systems at all. One wonders how Blanshard arrived at this conclusion, it being almost a direct restatement of the logical positivist doctrine of 'conventionalism', which states that empirically equivalent scientific theories are not really two different theories, but the same.

Blanshard's mistake is that he attempts to answer an illegitimate question. The question of adjudicating between competing coherent sets of beliefs is only justified if we think about truth in terms of correspondence (i.e. which set of beliefs corresponds to the one objective, mind-independent reality?). Obviously, the coherence theorist is attempting to re-define truth as something other than correspondence. Therefore, objecting to the theory on the grounds that it does not provide a technique for adjudication is akin to judging the coherence theory with respect to the aims of the correspondence theory - which is incoherent (and does not correspond either)184.

2.3.4 Science and Truth: Ontological and Anthropological Positions

"Strictly speaking it may be said that nearly all our knowledge is problematical."

- Marquis de Laplace

2.3.4.1 The Modern Narrative of Legitimation: Correspondence Truth in Science

"The positive argument for realism is that it is the only philosophy that doesn't make the success of science a miracle."

- Hilary Putnam185

In Section 2.3.1 realism is defined as any ontological position that states:

1. That something exists independent of any mind, subject or observer.
2. That such and such an entity can be said to exist in a mind-independent manner.

184 When a coherence theorist attempts to respond to such objections, the resulting theory cannot be otherwise than incoherent. Indeed, it should not be considered a coherence theory of truth at all, but a correspondence theory of truth augmented with a coherence theory of justification (see Bender (1989)). The position can be expressed as follows: a set of beliefs corresponds to the real world if and only if they are coherent and complete. Such a position could only ever be maintained by making an unprovable assumption - that the real world is rational (in the same way that we are). If this assumption is made then any incoherent set of beliefs can be dismissed as false. What remains is the set of sets of beliefs that are complete and coherent. In a truly bizarre twist, these are all deemed to be identical.
185 After years at the vanguard of realism, Putnam (1981) abandoned the cause and described his former position as "incoherent".

The first claim constitutes the minimal realist position: the belief that something exists independently of the mind. As has been shown, the so-called 'anti-realists' generally find no serious difficulty with this position. Thus, the anti-realists are not actually opposed to realism as such. However, as has also been mentioned, the minimal position can, should and has been objected to on the grounds that it provides us with no ontological insight whatsoever. It is an uninteresting, humdrum philosophical position (about as interesting as its opposite, absolute idealism, which states that nothing exists independent of the mind). Of more interest are the realisms that make some attempt at specifying the nature of the entities that their realism is committed to. It is these specifications that the 'anti-realists' are typically sceptical about.

Perhaps the most well-known contemporary issue in the realism / anti-realism debates is the claims of the scientific realists. Scientific realism differs from other realisms in that it asserts the existence of the unobservable entities of science. Following the abandonment of logical positivism, it has been typical to afford metaphysical conjecture a place within the sciences. Thus, we have conjecture about curved space-time, elementary particles, point masses and phlogiston all accepted as typical output of physics, and theories about evolutionary biology (e.g. mutation, selection, catastrophe and punctuated equilibrium) and human psychology (e.g. Freudian theories of behaviour and Jungian theories of personality) all accepted as typical outputs of so-called 'higher disciplines'. However, this has raised a whole set of difficulties associated with the status of such unobservables. The scientific realists argue forcefully that these entities must be considered in a realist manner; that is, if they are true, then they must correspond to ontological facts. This is the traditional, modern, narrative of legitimation of science: that science provides ontological truth. In opposition to the scientific realists, the so-called 'anti-realists' assert the fundamentally postmodern position: that such a narrative cannot be sustained and therefore the nature of science and truth need to be re-thought.

The most common argument for ontological truth in science actually involves pre-supposing it. That is, it assumes the mind-independent existence of unobservables and then finds that they provide a good explanation of the behaviour and characteristics of observables. Furthermore, such a pre-supposition leads to predictions about observables that seem to work. That is, the pre-supposition is observationally successful (Devitt, 1984). This line of argument relies on a method of reasoning known as abduction. Abduction allows for hypotheses about unobservables to be accepted by observation of the behaviour of observables. Putnam has summed up the abductive argument by stating:

"The positive argument for realism is that it is the only philosophy that doesn't make the success of science a miracle."

Notwithstanding the surface appeal of the abductive argument, there have been a number of criticisms of it. It is obvious that the neo-Humean mitigated sceptic, the neo-Kantian phenomenologist, the positivist and the postmodernist would all have trouble with such reasoning. However, perhaps more telling is that arguments against abduction have been increasingly arising from scientists and philosophers of a generally realist persuasion (Cartwright, 1983; Hacking, 1982; 1983; 1998; Boyd, 1984; 1985; 1992). One such person is Nancy Cartwright. In her book How the Laws of Physics Lie, Cartwright (1983) devotes a large section of the Introduction to a rejection of the form of argument that she entitles "inference to the best explanation", that is, abduction. Quoting from Duhem (1904) and Van Fraassen (1980), Cartwright (1983) poses the question:

"Show exactly what about the explanatory relationship tends to guarantee that if x explains y and y is true, then x should be true as well."

Cartwright (1983) concludes that "the lesson for the truth of fundamental laws is clear: fundamental laws do not govern objects in reality; they govern only objects in models."

The problems associated with ascribing unobservables a truth value with respect to absolute reality (ontology) are perhaps highlighted best by the under-determination thesis, which states that no body of evidence supports any theory to the exclusion of all rivals. That is, no set of actual observations can support an abductive inference to the existence of unobservables to the exclusion of other possible explanations. Indeed, our theories about unobservables are under-determined by the (observable) evidence and therefore any over-arching theory can be reconciled with any evidence, as highlighted by Lakatos (1970) in his hypothetical scenario involving Newtonian gravitation. Indeed, even Popper (1969) accepts this position and makes it the centre-piece of his arguments against verification and for falsification in Conjectures and Refutations. The under-determination of evidence for a specific theory highlights the idleness of arguments for scientific realism based on abduction. The thesis is well understood and can be found in the writings of many commentators, including:

• Quine, who states that: "any theory can be held true, come what may" (Quine and Ullian, 1970).
• Kuhn, who declares that: "there never comes a point at which it is unscientific to hang onto an old paradigm" (Kuhn, 1998).
• Feyerabend, who insists that: "anything goes, including hanging onto any theory one likes" (Feyerabend, 1975).
• Lakatos, who holds that any theory can be made to look good, "provided enough bright people commit their talents to it" (Lakatos, 1970).
• Goodman, who maintains that there does not exist any objective ground for choice between rival "ways of world-making" (Goodman, 1978).
• Popper, who holds that theories can never be deduced from their positive instances (Popper, 1972).
• Duhem, who claims that a crucial experiment is impossible in science (Duhem, 1904).

Scientific realists typically claim that science aims at (correspondence) truth. The more naïve of these affirm that scientific theories provide us with a literally true story of what the world is like and that acceptance of a scientific theory involves the belief that it is true (i.e. it corresponds). This position is simply not defendable given a basic understanding of Hume, Kant, the Quine-Duhem thesis, the under-determination thesis or the history of science. Increasingly, however, more sophisticated versions of scientific realism are arising. These 'critical scientific realisms' accept many of the sceptical arguments against the naïve position. For example, Devitt (1984) concedes that:

"Scientific realism takes the posits of science pretty much at face value. However, it is committed only to most of those posited by the theories that we have good reason to believe: it is committed to science's 'confident' posits."

Furthermore, he states that:

"The realist does not recommend to scientists that they should believe strongly in the entities of all theories, only those of the established theories that we have the best grounds to accept."

Devitt's (1984) use of terms such as 'confident' and 'established' reflects an acceptance of a psycho-social process lying behind which theoretical entities are accepted as corresponding and which are not. However, his use of terms such as 'best grounds' reflects a form of Cartesian foundationalism in regard to scientific reasoning and theory choice. Thus the more critical scientific realists will typically admit that socio-cultural attitudes influence theory choice, but maintain that, despite this, scientific theories can be accepted and rejected on purely rational criteria. Furthermore, the critical scientific realist will typically claim that, in the long run, it is the rational aspect of science that overwhelms the socio-cultural one. Thus, rather than claiming that scientific theories are true, critical scientific realists typically claim that science has made progress toward truth. We cannot know that our current theories are true, but we can know that they are truer than earlier theories. That is, they correspond more closely.

As has been discussed (Chapter 2.2), the problems associated with scientific progress and theory choice have been the focus of much debate in the literature. In general, the

postmoderns remain sceptical about any attempt to claim that one theory corresponds more closely to mind-independent reality than any other, whilst the moderns argue for a view of theory choice that affirms the Enlightenment meta-narrative about the steady march of science towards greater truth and utility. It is this latter position that Popper (1983) argues for, stating: "later theories could always explain everything their predecessors could, and more besides". Thus science progresses by convergence towards truth, or in Popper's words "verisimilitude" (Harris, 1974)186. Verisimilitude is a function of the relative truth and falsity content of the theories being compared. Theory A has greater verisimilitude than theory B if theory A has more true consequences and fewer false ones. However, both David Miller (1974) and Pavel Tichy (1974) showed that this definition is untenable: no false theory could have greater verisimilitude than any other false theory. Furthermore, the notion of 'distance from truth' involves some sort of linear ordering of theories. As it turns out, it is not mathematically possible to well-order more than two variables at the same time (Cole, 1998). Because almost all scientific theories of explanation are composed of numerous 'variables', the idea of verisimilitude becomes nonsensical.
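Popper's comparative notion, and the Miller-Tichy objection to it, can be stated compactly. The following is a standard reconstruction rather than a formula from the thesis; the notation Ct_T(A) and Ct_F(A), denoting the sets of true and false consequences of a theory A, is introduced here for illustration.

```latex
% Popper: theory A has greater verisimilitude than theory B iff
% B's truth content is contained in A's, and A's falsity content
% is contained in B's, with at least one inclusion strict:
Ct_T(B) \subseteq Ct_T(A)
\quad\text{and}\quad
Ct_F(A) \subseteq Ct_F(B)
% Miller (1974) and Tichy (1974): if A is false, the two conditions
% cannot both hold with a strict inclusion, so no false theory can
% have greater verisimilitude than any other false theory.
```

Since the historically interesting comparisons are precisely those between theories now regarded as false, this result is fatal to the definition as it stands.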

2.3.4.2 Postmodern Scepticism Towards The Modern Narrative: The Critique of Correspondence Truth in Science

"Laws of nature are not eternal, abstract truths. They are patterns that seem to prevail in some chosen context."

- Ian Stewart

There are many contemporary philosophers of science who doubt the veracity of the modern narrative187. For these thinkers, ascribing the characteristic 'approximation to ontological truth' to the aims or outputs of science is misguided and arrogant, insofar as it claims more than can be reasonably defended. Accordingly, the postmodern position states that scientific entities should not be construed as ontologically true but 'true', or perhaps 'warranted', in some anthropological sense. The reasons for this, as we have seen, are not because the postmoderns are absolute idealists or 'anti-realist', but because they hold a thoroughgoing scepticism towards science's ability to successfully legitimate its own narrative.

186 Popper's argument for scientific realism can be briefly summarised as follows: Refutable theories imply points that potentially clash with the real world. The more highly refutable a theory, the more points it implies that could clash with the real world. When a theory is corroborated (when it is tested at a refutable point and passes the test), it thus reveals something about the real world, at least at the point(s) where it was tested successfully. On the other hand, when a theory is refuted (when it is tested at a refutable point and fails the test) it is shown incorrect about the real world at that point. 187 See Chalmers (1990; 2000), Davidson (1983; 1984; 1990; 2001), Dummett (1976; 1977; 1978; 1982), Feyerabend (1975; 1977; 1981; 1987; 1991), Fine (1986; 1989; 1991; 1996; 1998), Goodman (1978; 1984), Kuhn (1962; 1963; 1970a; 1970b; 1977; 1998), Putnam (1981; 1983; 1985; 1987), Quine (1953a; 1969), Rorty (1979; 1982; 1991a; 1991b; 1998) and Van Fraassen (1980; 1982).

In this Section we describe three distinct arguments against the modern narrative of ascribing correspondence truth to successful scientific theories (and unobservable entities) and suggest three different types of postmodern attitudes that may be more appropriate. The arguments against the modern position may be defined as follows:

1. The scientific argument: successful scientific theories, such as quantum mechanics, elementary-particle theory, general relativity, etc., undermine realism.

2. The historical argument: most of the best scientific theories of the past have now been discarded, and the history of successful scientific theories reveals not the development of a single, ever more accurate and detailed picture of the world, but a series of discontinuous pictures of the world.

3. The philosophical argument: the modern position makes use of metaphysical notions of 'truth' as some sort of 'correspondence' with an inaccessible 'external world'. No philosophic sense can be made of the central metaphor of 'correspondence' and, furthermore, we have no epistemic access to mind-independent reality.

The postmodern views these three arguments as indicating that there is something misleading about the modern claim that "a scientist discovers what the world is really like". Accordingly, s/he prefers to construe scientific theories in a subjective (or relative) manner, after one of the anthropological theories of truth. For example:

1. Pragmaticism: scientific theories can be said to be useful insofar as they lead us to consensus.

2. lnstrumentalism: scientific theories can be said to be useful insofar as they help us to simplify and link observed phenomena satisfactorily.

3. Coherence: scientific theories can be said to be 'rational' or 'coherent' with another set of theories (which are taken to be axiomatic) but never 'true about the external world' insofar as they admit no doubt.

The scientific argument against the modern position arose after the First World War in response to the controversial 'Copenhagen interpretation' of sub-atomic physics. Two principal components of the Copenhagen interpretation contributed to an attack on realism: the complementarity principle of Niels Bohr (1928; 1935; 1961) and the uncertainty principle of Werner Heisenberg (1927; 1958).

The complementarity principle arose after Erwin Schroedinger (1926) showed that both quantum theory and wave theory could explain quantised phenomena. Unable to resolve the inconsistencies between the wave and quantum interpretations, Bohr (1928) formulated the complementarity principle, which claimed that the two models are complements of each other. The proposition that inconsistent theories are acceptable in science proved particularly disconcerting to scientific realists. Following closely behind the complementarity principle came Heisenberg's (1927) uncertainty principle, of which the best known instantiation is that the precision with which the position of an electron can be determined is proportional to the imprecision with which its momentum can be known, and vice versa. The reason it is impossible to be more precise about subatomic structure is that the measuring medium (electromagnetic radiation) interferes with the electron, and thus with the parameters to be measured (position and momentum). In the eyes of many, the uncertainty principle pushed physics into a totally new impasse: it would be impossible to gain reliable knowledge of structure smaller than that already discovered. According to Bell (1994), the uncertainty principle created a methodological faux pas for physicists; the purpose of method is to help extend knowledge, not trip up the attempt188.
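The uncertainty relation alluded to above is standardly written as the following inequality (a modern textbook statement rather than Heisenberg's original 1927 formulation), where Δx and Δp are the uncertainties in position and momentum and ℏ is the reduced Planck constant:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

The product of the two uncertainties is bounded from below, so reducing one necessarily increases the other; no refinement of instrumentation can evade the bound.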

Each in their own way, the complementarity and uncertainty principles suggested that theories about 'underlying structure' will always be highly speculative and thus difficult (if not impossible) to test empirically. Such suggestions are the beginnings of postmodernity in regard to science. As has been alluded to previously, one of the best-kept secrets in the philosophy of science in the first half of the twentieth century was that the logical positivists themselves were radical relativists about theory choice. In a somewhat ironical twist, the logical positivism of the Vienna Circle, which represents the most sustained intellectual effort to grant science a superior status to all other forms of knowledge, finds itself grouped with the likes of Feyerabend, Goodman, Kuhn, Rorty and the later Putnam when discussing the issue of scientific realism.

The positivists were well versed with observational evidence under-determining theory selection. Indeed, much of the 1920s research program in Vienna was associated with attempting to solve the problem of empirical equivalence. However, the positivists could not find any 'realistic' solution and retreated into contextual empiricism with the theory of 'conventionalism', which stated that the disagreement between two empirically equivalent yet distinct theories was but apparent. This was followed quickly by the verifiability 188 Following Bohr and Heisenberg, more recent quantum theory has also been said to challenge the modern position. Of particular note is the Bell inequality, which Putnam (1981) claims was one of the prime motivators of his abandonment of the realist cause and which Van Fraassen (1982) labels the "charybdis of realism". Indeed, quantum theory has by and large discarded the notion of a 'substantial' universe composed of individual particles that possess essences within themselves, choosing to see the world as composed of dynamic relationships that only within certain contexts could be said to form particles. Thus the existence of elementary particles is much more dependent on context than the realist would typically prefer (Grenz, 1996). Fine (1986), in his excellent discussion of realism and quantum theory, The Shaky Game, concludes that whilst quantum theory is not inconsistent with what has been described here as minimal realism (it does not falsify the notion of some 'real' structure corresponding to the systems,

principle, which stated that observable facts were the only absolute truth and that the cognitive meaning of a scientific theory consisted solely in its implications for actual and possible observations. The difference between the realist and positivist responses to empirical equivalence is illuminating. For the realist, the issue was much more problematic, as they were committed to affording structural conjecture a place in science and claiming that such conjecture could be said to correspond. The positivist, on the other hand, was only committed to the empirical posits of science and could bound structural conjecture out of scientific activity. Both, however, were committed to the idea that science could provide certainty (absolute truth) via some sort of objective access to phenomena, and it is this claim which ultimately undoes both positions.

A second source of doubt about the modern position finds its roots in a historical reflection. Many theories of the past that made claims about unobservable entities have since been rejected189. Newton's particle theory of light, the caloric theory of heat and Maxwell's electromagnetic theory are just some examples (Chalmers, 2000). The critical scientific realist responds by conceding that, in the light of such historical reflections, it would be inappropriate to claim that our current theories are true. However, s/he rejoins by claiming that we can know that they are closer to truth than our previous theories and proceeds to construct an 'asymptotic' or 'limit' account of theory development. This position has already been argued against on philosophic grounds; however, it can also be dismissed on historical grounds.

According to Kuhn, Feyerabend and many other historians of science, theory change is rarely cumulative, as Popper and the critical scientific realists would have us believe. The writings of these historians teem with example after example of theories adopted by the relevant community even though they failed to solve all the problems solved by their predecessors (Laudan, 1996). Indeed, one of the most striking messages to come out of the Kuhn-Feyerabend corpus is that there are losses as well as gains in most instances of theory change190. Thus both Kuhn and Feyerabend argue that science is not progressive and hence that verisimilitude cannot be justified. Indeed, their historical writings can be thought of as a falsification of Popper's position. Instead of tracing the progress of a single ever-truer picture of reality, Kuhn and Feyerabend find that the history of science is littered with discontinuous jumps where entire theories are abandoned in favour of alternative ones in what they term a scientific revolution. This implies that the idea of 'progress' towards truth is flawed. As Kuhn (1993) states:

observables, states and probabilities of contemporary quantum theory), the 'game' of offering realist interpretations seems to be at best 'shaky'. 189 Furthermore, when we have posited unobservables in the past, we have often been wrong; so by abduction (the realist's favourite form of argument) we are probably wrong about the unobservables currently posited. 190 Kuhn and Feyerabend have produced an impressive list of case studies which support this contention (Feyerabend, 1975; 1987; 1991; Kuhn, 1962; 1963; 1977; 1998; 2000).

"I aim to deny all meaning to claims that successive scientific beliefs become more and more probable or better and better approximations to the truth and simultaneously to suggest that the subject of truth claims cannot be a relation between beliefs and a putatively mind-independent or external world."

In light of such historical studies, the postmoderns point out that, just as theories in the past proved popular as explanations and successful as predictors, in spite of the fact they were later thought to be falsified, so it is reasonable to assume the same about contemporary ones. This form of abduction does not refute the modern position. However, it does undermine both the 'success' argument for the naïve realist position and the 'progress' argument for the critical realist position. And, as luck would have it, it undermines Boyd's (1985) conflation of the two in his 'dialectic' argument as well.

The third source of doubt about the modern position comes from philosophical arguments. These usually take one of two forms:

1. Neo-Humean Empiricist Arguments: which claim that statements about unobservable scientific entities (and theories of reality) must be hypothetical, as they transcend that which can be firmly established by observation.

2. Neo-Kantian Postmodernist Arguments: which claim that statements about mind-independent scientific entities (and theories of reality) must be hypothetical, as they transcend that which can be firmly established when the mind is implicated in every interpretation of reality.

Whilst this thesis agrees with the scepticism of the empiricist (or positivist) about the claims of scientific realism, it argues against its reasoning; in particular, against the claim that there is something epistemically significant about observation, in the sense that observing O yields incorrigible knowledge of O. Rather, it is argued that the Neo-Kantian, postmodern, position is the more robust, neither requiring epistemic significance of observation (thereby not limiting its scepticism to scientific realism) nor requiring either certainty or truth of the products of scientific practice.

The source of the empiricist scepticism toward scientific realism should by now be evident. Traditional empiricist / positivist epistemology holds that observing O (usually a sense datum) yields incorrigible knowledge of O. Furthermore, it is exactly this type of incorrigible knowledge that is mandated of all scientific knowledge, thereby reducing science to discovering relationships between sense data and nothing more. The scientific realist, however, wishes to radically increase the domain of applicability of science and include such things as metaphysical speculation, structural conjecture and theories of origins. Such

'theoretical' posits, however, are not derivable solely from sense data and require a set of auxiliary assumptions to hold. The positivist rejects outright that these theoretical posits can be said to have the same epistemic status as sense data. In particular, the positivist argues that sense data can be said to be real and that the theoretical posits can only ever be ideal. The scientific realist disagrees, claiming that both may correspond. Once again, it is important to understand exactly what it is that the positivist is sceptical about; s/he is sceptical about unobservability.

A question that immediately comes to mind is: what exactly does the positivist mean by observability? Surely they do not attach especial significance solely to sight. If not, then how is the notion of observability to be applied to other senses? The positivist response is that when a human observes an object O, s/he will arrive at beliefs about O as a result of its stimulation of a sense organ in an appropriate way (Devitt, 1984). Therefore, to observe O is to see, hear, touch, smell or taste O. However, as soon as other senses come into play, 'observability' loses much of its apparent authority. This is because the belief formed about O as a result of sensory stimulation is not given solely by the sensory input but is the result of human processing and understanding. That is, it is, in some way, theory-laden. Any attempt to attach especial epistemic significance to observability misunderstands the nature of Kant's Copernican revolution. According to Kant's Transcendental Aesthetic, all sensible judgements are intuition-laden (including apparently objective observations). Thus, the entities we come to believe in, as a result of observation, are already, in some sense, theoretical posits. Accordingly, the significance attached to the observation / theory distinction dissolves and there is no 'special problem' of theoretical entities191.

With the fall of the observation / theory distinction, the positivist rejection of scientific realism loses much of its credibility (as its raison d'être is found in attaching some special epistemic status to observability). Both the unobservable and observable entities of science now have the same status. Either both can be said to correspond or both cannot. However, theory-ladenness not only reduces observability to the same status as unobservability, it also undermines any attempt to suggest that either can be said to correspond. To be able to single out correspondence between two domains, one needs independent access to both. However, since Kant's Critique of Pure Reason (Kant, 1781) and the acceptance of the theory-ladenness of observation, it is difficult to argue that independent access to the 'real world' is possible. Thus, the idea of correspondence is considered to be at best obscure and at worst incoherent. If truth holds for a scientific theory in virtue of it corresponding with mind-independent facts, then we will inevitably need some explanation of how minds can access

191 A similar point is made by Grover Maxwell (1998) in "The Ontological Status of Theoretical Entities", where he concludes that: "our drawing of the observational-theoretical line at any given point is an accident and a function of our physiological makeup, our current state of knowledge and the instruments we have available" (Maxwell, 1998). Thus, the observation / theory distinction is contingent (on our abilities to perceive) and not absolute: what is unobservable today may be observable tomorrow. It is therefore an inappropriate tool for partitioning the real from the ideal.

mind-independent facts (or, as Kant would have it, given that all understanding is conditioned, an explanation of why it is that we always seek to refer to unconditioned reality). A number of arguments have been put forward against correspondence. Indeed, many of these have already been presented. However, Hilary Putnam perhaps sums up the situation best (when explaining the reasons for his eventual rejection of scientific realism) by stating:

"We cannot compare theories with 'unconceptualised reality'. Reference requires a transcendental match between our representations and the world in itself: a God's eye view192."

That is, the correspondence theory of truth requires knowing the unknowable and speaking the unspeakable193. Notwithstanding these difficulties, the overwhelming majority of scientists (and a significant proportion of philosophers of science) lean toward some form of scientific realism (Kourany, 1998)194.

The upshot of all this is that it is not just unobservables that are theory-laden and fallible, but all descriptions of the world. Nothing can be said to correspond with mind-independent reality and therefore all attempted descriptions of the world involve some sort of epistemic risk. This, broadly speaking, is the position of the collection of schools that this thesis labels as postmodern. Among these are the sociologists of knowledge (Bloor, Hesse, etc.), radical epistemologists (Kuhn, Feyerabend, etc.), contemporary pragmatists (Rorty, the later Putnam, etc.) and post-structuralists (Foucault, Derrida, Lyotard, etc.). Acceptance of the postmodern arguments against scientific realism means conceding that there must always be some form of ontological doubt about the assumed 'real' existence of scientific entities and, therefore, a corresponding epistemic risk. This, in turn, should lead to a position of epistemic humility towards the pronouncements of science195.

192 Much to the chagrin of Devitt (1984) and others, who have labelled him a renegade, Hilary Putnam, after years at the vanguard of scientific realism, abandoned his cause, describing it as incoherent. 193 The view is of no value unless it is accompanied by some instructions on how to tell the degree to which a theory is true, or how to tell that one theory is closer to truth than another (i.e. has greater verisimilitude). In the absence of such criteria, talk of correspondence truth becomes meaningless. 194 The dominant view of scientific progress is still the realist one: that science advances towards the goal of truth. However, the sustained sceptical attacks on the idea of scientific progress have spawned a number of alternative candidates for the goal of science. These include simplicity, coherence and explanatory power, none of which necessarily entail (correspondence) truth. 195 Scientific theories should never be thought of as true. A more appropriate attitude to scientific theories is that they should be considered in a manner after a theory of truth that does not pre-suppose realism. For example: pragmaticism - scientific theories can be said to be useful insofar as they lead us to consensus; instrumentalism - scientific theories can be said to be useful insofar as they help us to simplify and link observed phenomena satisfactorily, leading us to correlate and predict the results of observation and experiment. Henri Poincaré (1952) exemplified this position when he compared theories to a library catalogue. Catalogues can be appraised for their usefulness, but it would be folly to think of them as true or false in the sense that they correspond to reality; coherence - scientific theories can be said to be 'coherent with another set of theories' (which are taken to be axiomatic) but never 'true about the external world' insofar as they admit no doubt. Thus, theories may be 'true' with respect to a chosen theoretical framework but never in an absolute sense.

2.3.5 Conclusions: The Legitimacy and Illegitimacy of Truth

"Now we see but through a glass darkly, then we will see as it truly is."

- Paul of Tarsus

2.3.5.1 Ontological Truth or Anthropological Truth: The Case Against Ontological Truth

"Out of the crooked timber of humanity no straight thing was ever made."

- Immanuel Kant

Various attempts have been made to address the issue of what is real (ontology), what we know (epistemology) and the language we use to communicate what we know (linguistics, semantics, semiology, etc.). Unfortunately, it is widely held that we cannot know the exact relationship between knowledge, the language we use to frame knowledge and reality (Midgley, 2000). Notwithstanding this, it is important to separate two issues in order to shed some light on our discussion of the modern and postmodern positions in regard to the relationship between science and truth:

1. The ontological issue: whether there is a world that is independent of the way we experience it, a real world, against which our theories about the world can be judged correct or incorrect.

2. The epistemological issue: whether we can have independent access to this real world, assuming that it exists.

In response to the ontological question, two positions may be taken:

1. Ontological realism: which claims that there is a mind-independent 'real world', whether we can know it or not196.

2. Ontological idealism: which claims that there is no 'real world', other than that which is created by our own minds.

As we have shown, many of the so-called 'anti-realists' of both antiquity and the present day have no problem with the minimal position of ontological realism. That is, they accept the existence of a mind-independent, or 'real', world. Thus the pyrrhonists, the academics, Hume's mitigated scepticism, Kant's transcendental idealism, the positivists and the

postmoderns all would accept ontological realism when the question is put in the above manner. Indeed, it is difficult to find adherents of the alternative (ontological idealism), which holds that the existence of the everyday world of experience is completely mind-determined. Given the above, the question that naturally arises is: if the anti-realists are actually realist, then why are they perpetually labelled as 'anti-realist'? It is argued that the cause of such mislabelling is that much of the debate between realism and anti-realism in the literature confuses the ontological issue with the epistemic one. Specifically, realists continually attempt to dismiss sceptical arguments on ontological grounds, assuming that sceptics of realism must be absolute idealists.

If the 'anti-realists' are not opposed to realism, then what is their disagreement with the realist? The answer to this question falls naturally out of the above distinction between the ontological issue and the epistemic one. That is, their disagreement does not involve the existence of mind-independent reality (i.e. the ontology of the real), but the nature of our knowledge of mind-independent reality (i.e. the epistemology of the real). Thus, whilst both may accept the minimal realist ontological position, they disagree on the appropriate epistemic attitude towards this 'real existence'. Specifically, they disagree about epistemic accessibility. For the minimal realist, there are two broad epistemological positions available:

1. Epistemic accessibility: which claims that the real world is accessible and that we can obtain objective and absolute knowledge of this world via some form of interaction with it. The possibility of epistemic access leads to a search for appropriate methods capable of yielding epistemic certainty. This is the traditional (modern) view of epistemology and is the basis of what has been labelled 'scientific realism' - the view that the results of science can be associated with ontological truth.

2. Epistemic inaccessibility: which claims that if there is a real world, then what we know is not the world as it 'truly is', but the world as it is to us. That is, the world filtered, interpreted and, in important ways, 'constructed' by our a priori faculties. Thus, we can only ever know things "in the subjectively determined modes of our own thinking and not as they are in themselves" (Magee, 1985). This position rejects the possibility of mind-independent, epistemic access, thereby rendering the search for methods that may yield certainty obsolete. According to this position, the traditional (modern) view of epistemology is unworkable. Furthermore, inaccessibility naturally leads to a position of epistemic humility. This is the emerging (postmodern) view of epistemology and it is in response to this view that the anthropological theories of truth have been developed.

156 Ontological realism is silent on the epistemic question of knowledge.

By rejecting the modern doctrine of accessibility, the postmoderns reject the entire meta-narrative built around it - that we may overcome our own subjectivity through appropriate 'objectifying' methods (such as the 'scientific method') and, therefore, gain access to objective knowledge. In contrast, the postmoderns argue against any claims of objectivity and suggest that the pre-supposition of accessibility cannot withstand the attack made upon subject-object dualism by Kant (1781; 1783; 1786; 1788; 1798), scientific method by Kuhn (1962; 1963; 1970a; 1970b; 1977; 1990) and Feyerabend (1975; 1978; 1981; 1987; 1991) or foundationalism in general by the pragmatists and post-structuralists alike (Nietzsche, 1873; 1879; 1885; 1886; 1887a; 1889; Derrida, 1966a; 1966c; 1966d; 1972a; Rorty, 1979; 1982; 1991a; 1991b; 1998; Baudrillard, 1981; 1983; 1987; Foucault, 1966; 1969). According to all of these, the subject is always implicated in all object conceptualisations and therefore absolute knowledge of objects is impossible. Accordingly, many argue that a pragmatic, neutral attitude towards ontology be adopted. As Kant stated some 200 years ago:

"Wir sehen das Innere der Dinge gar nicht ein." (We have no knowledge of things in themselves.)

If this is the case, then the ontological (correspondence) theories of truth, which rely on knowledge of things in themselves, are obviously impractical. Better to adopt a position of ontological neutrality and epistemic humility. And what better way to secure these than to do away with correspondence truth altogether and re-interpret the word in anthropocentric terms.

2.3.5.2 Ontological Truth or Anthropological Truth: The Case for Ontological Truth

"What is crooked cannot be made straight, and what is lacking cannot be numbered."

- King Solomon

In light of the arguments against correspondence, what then is to become of truth?

Do we relativise truth, as the anthropological theories suggest? Certainly Rorty (1991a) would suggest not: "It seems pointlessly paradoxical to relativise truth ... true for me but not for you and true in my culture but not in yours are weird pointless locutions". Truth, historically understood, is absolute!

However, the arguments against the historical understanding of truth (i.e. the ontological theories) are compelling. Once the mind has been implicated in every observation (and subsequent understanding) of the world and Cartesian subject-object dualism brought into question, it is difficult not to arrive at the conclusion that truth-as-correspondence is unknowable and therefore an entirely vacuous concept157. In opposition to the ontological theories of truth, many have realised that whilst 'true for me and not for you' makes no sense, 'justified for me and not for you' makes perfect sense. Justification (or legitimation), as we have seen, is an entirely relative construct! In essence, what the anthropological theories of truth do is overcome the disjunction between justifying what is put forward as truth (which will always be relative) and truth itself (which is absolute) by redefining truth as justification. Truth is no longer thought of as an internal property of a belief by way of some relation from language through meaning to ontology. Rather, it is thought of as something projected onto beliefs by way of that belief meeting some set of man-made conditions (e.g. the belief has achieved consensus, or it is instrumentally useful, or it 'fits' (coheres) with other beliefs). By so doing, the anthropological theories strip 'truth' of its majesty in lieu of the poverty of its attainability (see Figure 11 below):

Ontological theories of truth (absolute): a true belief involves some form of reference (or correspondence) from language through meaning and towards ontology.

Anthropological theories of truth (relative): a true belief does not involve any form of reference to the real world. Rather, truth is the name we give to beliefs that meet some set of man-made conditions.

Figure 11: Ontological and Anthropological Theories of Truth

Such a position can only ever be legitimated by arguing that the sheer unattainability of (ontological) truth makes the notion of its existence pointless. If it is impossible to ever know whether we are getting closer to, or further away from, (ontological) truth (i.e. if truth can never be a legitimate goal of inquiry), then what use does the notion serve? Moreover, if our only criterion of truth is justification, then why make a distinction between the two? Better to rid ourselves of the notion of ontological truth and be content to work with the notion of anthropological truths (i.e. what was previously termed justification).

157 Such a conclusion dissolves the central puzzle of modern epistemology: to find objective foundations for knowledge, thereby ensuring we have the ability to uncover the truths of the universe. With the apparent death of epistemology the reader is entitled to ask what, if anything, will take its place? Indeed, this is the exact question which contemporary philosophers of language have asked, concluding that linguistics will be the "epistemology of the future" (Dummett, 1982). Rorty (1979) has labelled this the linguistic turn and argued that it is just another attempt to fix a foundation for knowledge. The linguist accepts that epistemology has been dissolved, but rather than reconstruct his/her view of knowledge (as needing foundations), she/he attempts to grant philosophy of language the same role that epistemology once had: that of grounding knowledge. Thus, Dummett (1982) sees philosophy of language (and in particular a theory of meaning) as foundational. Rorty (1979), on the other hand, is opposed to this, arguing that "the cultural space left by the death of epistemology not be filled". I am inclined to agree with Rorty. Indeed, the retreat into the philosophy of language seems to be simply another attempt at securing foundations for knowledge.

However, with the realisation of the anthropocentric nature of justifying truth claims, an alternative path (to the one chosen by the anthropological theories of truth) lies open. This path sees benefit in retaining the notion of absolute (ontological) truth even in the face of its unattainability. It suggests that whilst truth can no longer lay claim to being a legitimate goal of inquiry, the concept may yet prove useful. And so we return to the question we began with: what then is to become of truth? To answer this question we will need to return, once again, to Immanuel Kant.

The reader will recall that Kant was the first thinker to implicate the a priori structures of the mind in every observation, description and interpretation of the world. As such, he inaugurated the philosophic tradition that has given us the critique of the ontological (correspondence) theories of truth and the development of the anthropological theories. However, Kant's critique also provides us with a reason for retaining the notion of absolute (ontological) truth. This reason is found in his much overlooked transcendental dialectic.

In the dialectic, Kant discusses the transcendental illusion - the illusion that human reason is capable of yielding knowledge of things in themselves (a.k.a. noumena). As we have seen, the implication of Kant's Copernican shift is that we have no knowledge of pure, unconditioned, noumenal reality. As such, what we know is always impure, conditioned, phenomena. In short, it is anthropocentric. All of this seems to indicate that correspondence truth should be discarded and the radically anthropocentric nature of what is brought forth as truth embraced through appropriate re-definitions. However, Kant argues that despite knowledge of the limits of reason (in our case, the anthropocentric nature of truth claims), we nonetheless constantly succumb to the transcendental illusion and think beyond these limits (in our case, we continue to make truth claims). This represents a fundamental problem for post-Kantian thought. Having cleared a path for epistemic humility, how is this humility to be maintained?

Kant's response to this problem was to develop a set of a priori concepts of 'pure reason'. These capture the kinds of things that would need to be known if we are to grasp the nature of unconditioned, noumenal reality. However, as Kant said: "the 'understanding' is not in a position to yield even the mere project of any one of these [concepts]". Thus, by their obvious unknowability, the a priori concepts of 'pure reason' witness to the conditioned nature of all knowledge. And therein lies the significance of the transcendental dialectic to our discussion of the value of the correspondence theories of truth. By their obvious unattainability, the ontological theories witness to the anthropocentric nature of all truth claims. Paradoxically, by retaining the idea of an absolute (yet unknowable) ontological truth of the matter, we help secure the postmodern turn against the epistemic arrogance of traditional metanarratives of legitimation and towards a position of epistemic humility. As such, the case for truth is made not on the basis of its attainability, but on the basis of its unattainability.

The truth is we must form an unlikely alliance between Jesus and Socrates, Plato and Heraclitus, Descartes and Kant, Newton and Nietzsche, Carnap and Kuhn, Popper and Feyerabend, Schleiermacher and Derrida. From the former in each of these pairings, we retain the idea of (capital T) Truth. Ontological Truth. Truth as correspondence to mind-independent, unconditioned, noumenal reality - "the only truth worthy of the name" - the ideal that, in the end, keeps us humble. From the latter, we are "awakened from our dogmatic slumber" and understand that what is brought forth as (small t) truth is always "a sum of human relations ... which after long use seem firm, canonical, obligatory". That is, truth is historically, genealogically, conceptually and methodologically conditioned and never representative of unconditioned reality. However, from both we begin to realise that the greatest story ever told during the modern era, the story of science's cumulative progression towards ontological truth, cannot be sustained. Thus, the incredulity towards narratives of legitimation that arose out of the epistemic story of postmodernism (Part One) remains intact in the face of the methodological story of philosophy of science (Part Two).
