Book, Incomplete Nature: How Mind Emerged from Matter


Cybernetics and Human Knowing, Vol. 25 (2018), nos. 2-3, pp. 173–179

The Self Is Something Less, Not More, than Matter

Liqian Zhou1

A review of Jeremy Sherman's Neither Ghost nor Machine: The Emergence and Nature of Selves, Columbia University Press, New York, USA, 2017. 295 pages, ISBN: 9780231173332.

The emergence and nature of life and mind have long been seen as two of the most fundamental questions in science and philosophy, yet they have had no satisfying answers since the very beginning of human civilization. Terrence Deacon finally provides a fundamental insight into these questions in his 2012 book, Incomplete Nature: How Mind Emerged from Matter. However, because of its neologisms, its writing style, and inadequate editing, the book has led to many misunderstandings. The book I review here, Jeremy Sherman's Neither Ghost nor Machine: The Emergence and Nature of Selves, aims to give a brief and simplified reformulation of the ideas in Incomplete Nature.

Purposeful phenomena, like intentionality, self, consciousness, qualia, value, and so forth, are often seen as things fundamentally different from physical processes. Thus, we have to work with two realms: cause-and-effect and means-to-ends. However, if we believe that the means-to-ends realm exists in the physical world, then what is its place in physical nature? How can purposeful phenomena, as non-physical processes, have physical consequences? How can we bridge is and ought? Explaining these phenomena constitutes a large part of contemporary science and philosophy, including AI, cognitive science, and philosophy of mind.

There are two ways of approaching the problem: top-down and bottom-up. Since the human mind distinctively shows all these characteristics, the most intuitive way is to study and simulate the brain, especially its neural systems, which we intuitively think embody mind. This is what we call brain science or neuroscience.
Some try to build artificial models simulating the functions of the brain. This is what we call artificial intelligence. These sciences of mind have now integrated with one another, and with other disciplines thought relevant to mind, such as psychology, linguistics, and logic, into cognitive science. Most accounts of mind today follow a top-down way. As we can see, this approach takes for granted the ontological assumption that mind and matter are two distinct things, and then investigates the relationship between them. It is easy for the approach to fall into one of two positions: panpsychism or eliminativism. Panpsychism takes mind as something basic and unexplainable, while eliminativism argues that mind is illusory: only physical processes exist and nothing more. David Chalmers's (1996) double-aspect theory, which claims that consciousness is a basic property of the universe like physical properties, and Thomas Nagel's account of mind as a basic property of the cosmos (Nagel, 2012), stand with panpsychism. Behaviorism (Ryle, 1949), computational functionalism (Putnam, 1973), and eliminative materialism (Churchland, 1981) stand with the machine position.

1. Department of Philosophy, Nanjing University. Email: [email protected]

Rather than beginning with the most sophisticated human mind and taking it for granted, the bottom-up approach goes the other way around. Unlike the top-down approach, which asks questions like "What is the nature of mind?" and "What is the relationship between mind and brain?", a bottom-up approach asks different ones, such as how mind and self in their minimal sense emerged from matter. This is a way taken by few, since it is the harder one. Nevertheless, Jeremy Sherman's book, Neither Ghost nor Machine: The Emergence and Nature of Selves, exemplifies a solution within the bottom-up approach.
After working with Deacon for more than two decades, Sherman has written the book as a beginner's guide to Incomplete Nature for lay audiences. Sherman is an independent scholar and an excellent science blogger for Psychology Today, where his blog has been viewed more than 4 million times. He has a master's in Public Policy from Berkeley and a PhD in decision theory and evolutionary theory from Union Institute and University. Sherman does not aim at an original work but at a clear reformulation of Deacon's ideas in Incomplete Nature, showing how Deacon's account is a paradigm shift in ontology, methodology, and theory. However, as I will show, this does not mean that the book is not a creative work.

Learning from the feedback on Incomplete Nature from a variety of readers, Sherman carefully arranges the structure of the book. He divides it into seven parts: I) Overview; II) Framing the Mystery; III) Dead Ends, Live Clues; IV) Grounding a Solution; V) Deacon's Solution; VI) The Interpreting Self; and VII) Implications.

The book seems dedicated to solving the mystery of purpose: "What is purpose and how does it emerge from purposeless phenomena?" [p. 3]. There can be no purpose without selves that the purpose is for. Therefore, Sherman reformulates purposes in terms of aims and selves. Since only selves aim, and their origin and nature are inextricably linked, he argues that the real problem is how matter becomes mattering. This is also the problem of the origin of life. Thus, in order to know the nature of mind, we should begin with the origin of life.

Both panpsychism and eliminativism, given their top-down approaches, fall short of solving the problem. Sherman calls panpsychism ghost and eliminativism machine. Ghost cannot solve the problem because it simply takes aims and selves for granted without explaining them. Machine cannot solve it either, because it tries to reduce them to physical processes.
That is to say, one does not explain them while the other does not admit their existence. However, aims and selves are real but not mysterious or supernatural. Hence, Sherman thinks that the solution should be grounded in naturalism, not materialism [p. 107]. Materialism, the view that nothing exists except matter, is the basis for eliminativism; this is not what we want. We also think that aims and selves must be explainable scientifically, or we fall into the ghost position. Hence naturalism.

Then how can we pursue the solution without falling into either of the two positions? Sherman argues that the first thing we should do is change the methodology. The reason the top-down approach falls into either ghost or machine is that it commits a reverse-engineering fallacy. Engineering always moves from "prescription through explanation to description" [p. 97], while science should move "sequentially from description through explanation to prescription" [p. 97]. It becomes a fallacy when one does science in the engineering way. This is what has happened to the sciences dedicated to studying mind, like AI, cognitive science, and computer science; it is a fallacy because function is multiply realizable, as philosophers of mind discovered long ago. The specific mechanism an engineer would invent to realize a function may not correspond to the actual one.

This is what is sometimes called the positive way of thinking: it treats aims and selves as something more than matter and seeks the mechanism that makes them more likely to happen. If we want to disentangle the mystery, Sherman argues, we should think about it negatively. Rather than concentrating on how to make it more likely, we should focus on what is eliminated. According to this negative thinking, which focuses on what is absent rather than what is present, selves are not something added to matter but emerge from the dynamic possibilities of matter.
Unlike top-down approaches (entelechy, vital force, agency, etc.), on which selves are something added onto selfless things, Sherman argues that selves are always possible but that the probability of their actualization is vanishingly small. The problem, then, is how the probability of the emergence of self can rise toward 1; that is, how self emerges from a state in which self is barely probable. According to negative thinking, the secret to solving the problem lies in the elimination of other possibilities: the state that is present comes from what is prevented, namely via constraints.

The most counterintuitive feature of life is that it seems to violate the second law of thermodynamics. This is the so-called Clausius–Darwin paradox. According to the second law, proposed by Rudolf Clausius, an isolated system tends toward thermal equilibrium, the state of maximum entropy or disorder, while the forms of life evolve to be more and more elaborate through the natural selection discovered by Charles Darwin. It is a paradox because, in Sherman's terms, a thermodynamic process is a process toward irregularity, while evolution is an amplifying process of self-regeneration in living systems, which truly have a self. The mystery of purpose then becomes how self-regeneration emerges from irregularity. This happens through the emergent elimination, or narrowing down, of possibilities through constraint.

Suppose there is an isolated system with a massive number of elements in thermal equilibrium. This means that the freedom of each element in the system is maximal: "It's just all possibilities equally presented" [p. 118]. As a result, the probability that a specific state of each element is actualized, which is inversely proportional to the number of possible states of that element, is very low.
Therefore, it is possible for the whole system to deviate from thermal equilibrium, but the probability, which equals the product of the probabilities of the specific states of the elements if the elements are independent of each other, is extremely low.
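The arithmetic behind this point can be made concrete. The following is an illustrative sketch (my own, not from the review or the book): if each element has k equally likely states, the probability of one specific joint configuration of n independent elements is the product of the per-element probabilities, (1/k)^n, which collapses toward zero as the system grows.

```python
# Illustrative sketch, not from the review: why a specific deviation from
# equilibrium is astronomically improbable in a large system of
# independent elements, each with k equally likely states.

def specific_state_probability(k: int, n: int) -> float:
    """Probability of one particular joint microstate of n independent
    elements, each uniformly distributed over k possible states."""
    return (1.0 / k) ** n

# The joint probability is the product of the per-element probabilities,
# so it shrinks exponentially with the number of elements.
print(specific_state_probability(2, 10))   # 1/1024, about 9.8e-04
print(specific_state_probability(2, 100))  # about 7.9e-31
```

Even this toy calculation shows why, without constraints that prune the space of possibilities, a far-from-equilibrium configuration is effectively never actualized by chance.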
Recommended publications
  • The Teleodynamics of Language, Culture, Technology and Science (LCT&S)
    Information 2013, 4, 94–116; doi:10.3390/info4010094. ISSN 2078-2489, www.mdpi.com/journal/information. Review.

    Robert K. Logan 1,2. 1 Department of Physics, University of Toronto, 60 St. George Street, Toronto, ON M5S 1A7, Canada; E-Mail: [email protected]; Tel.: +1-416-361-5928. 2 Strategic Innovation Lab, OCAD University, Toronto, ON M5T 1W1, Canada. Received: 8 November 2012; in revised form: 30 January 2013 / Accepted: 2 February 2013 / Published: 7 February 2013.

    Abstract: Logan [1] in his book The Extended Mind developed the hypothesis that language, culture, technology and science can be treated as organisms that evolve and reproduce themselves. This idea is extended by making use of the notion of teleodynamics that Deacon [2] introduced and developed in his book Incomplete Nature to explain the nature of life, sentience, mind and a self that acts in its own interest. It is suggested that language, culture, technology and science (LCT&S), like living organisms, also act in their own self-interest, are self-correcting and are to a certain degree autonomous even though they are obligate symbionts with their human hosts. Specifically, it will be argued that LCT&S are essentially teleodynamic systems, which Deacon defines as "self-creating, self-maintaining, self-reproducing, individuated systems" [2] (p. 325).

    Keywords: language; culture; technology; science; teleodynamics; morphodynamics; thermodynamics; organism; obligate symbiont

    1. Introduction: Although [teleodynamics] is the distinguishing characteristic of living processes, it is not necessarily limited to the biological. (Deacon) Terrence Deacon [2] in his book, Incomplete Nature: How Mind Emerged from Matter, attempts to develop a scientific theory of how properties such as information, value, purpose, meaning, and end-directed behavior emerged from physics and chemistry.
  • A Defense of Equilibrium Reasoning in Economics, by Jennifer Soyun Jhun (BA in Philosophy and Economics, Northwestern University)
    A Defense of Equilibrium Reasoning in Economics. Jennifer Soyun Jhun, BA in Philosophy and Economics, Northwestern University, 2008. Submitted to the Graduate Faculty of the Kenneth P. Dietrich School of Arts and Sciences in partial fulfillment of the requirements for the degree of Doctor of Philosophy, University of Pittsburgh, 2016. Defended on May 27, 2016. Committee: Robert Batterman, Professor of Philosophy, University of Pittsburgh; Sheldon Smith, Professor of Philosophy, University of California, Los Angeles; Dissertation Advisor: Mark Wilson, Distinguished Professor of Philosophy, University of Pittsburgh; Dissertation Advisor: James Woodward, Distinguished Professor of History and Philosophy of Science, University of Pittsburgh. Copyright © by Jennifer Jhun 2016.

    Abstract: Critics both within and outside of philosophy have challenged economics wholesale as unscientific. In particular, economics seems unable to predict future events because it relies on assumptions like equilibrium conditions, which stipulate that the economy tends to stay in its current state absent external forces. The popular background view that gives rise to this criticism is that the job of science is to uncover laws of nature, by appeal to which we can determine (usually deductively) the future behavior of a dynamical system as it evolves. I argue that lawlike statements in economics have a very different role than this: they provide a means of understanding in terms of how efficient a particular system is.
  • Macroscopic Time Evolution and MaxEnt Inference for Closed Systems with Hamiltonian Dynamics
    Macroscopic time evolution and MaxEnt inference for closed systems with Hamiltonian dynamics. Domagoj Kuić, Paško Županović, and Davor Juretić, University of Split, Faculty of Science, N. Tesle 12, 21000 Split, Croatia.

    Abstract: The MaxEnt inference algorithm and information theory are relevant for the time evolution of macroscopic systems considered as a problem of incomplete information. Two different MaxEnt approaches are introduced in this work, both applied to prediction of time evolution for closed Hamiltonian systems. The first one is based on the Liouville equation for the conditional probability distribution, introduced as a strict microscopic constraint on time evolution in phase space. The conditional probability distribution is defined for the set of microstates associated with the set of phase space paths determined by solutions of Hamilton's equations. The MaxEnt inference algorithm with Shannon's concept of the conditional information entropy is then applied to prediction, consistently with this strict microscopic constraint on time evolution in phase space. The second approach is based on the same concepts, with the difference that the Liouville equation for the conditional probability distribution is introduced as a macroscopic constraint given by a phase space average. We consider the incomplete nature of our information about microscopic dynamics in a rational way that is consistent with Jaynes' formulation of predictive statistical mechanics and the concept of macroscopic reproducibility for time-dependent processes. Maximization of the conditional information entropy subject to this macroscopic constraint leads to a loss of correlation between the initial phase space paths and final microstates. Information entropy is the theoretic upper bound on the conditional information entropy, with the upper bound attained only in the case of a complete loss of correlation.
  • The Transition from Constraint to Regulation at the Origin of Life
    [Frontiers in Bioscience 19, 945–957, June 1, 2014] The transition from constraint to regulation at the origin of life. Terrence W. Deacon1, Alok Srivastava1, Joshua Augustus Bacigalupi1. 1 Department of Anthropology, University of California, Berkeley, CA 94720.

    Contents: 1. Abstract; 2. Introduction (2.1. Living thermodynamics; 2.2. Context-limited versus self-limiting dissipation; 2.3. The autopoietic dilemma; 2.4. The formal versus physical logic of biological regulation); 3. Autogenesis (3.1. Constraints on constraint-production; 3.2. From synergistic constraint generation to regulation); 4. Conditional autogenesis (4.1. The emergence of cybernetic regulation; 4.2. Template regulated autogenesis: an imaginative scenario); 5. Conclusions; 6. References.

    Abstract: The origin of living dynamics required a local evasion of thermodynamic degradation by maintaining critical dynamical and structural constraints. Scenarios for life's origin that fail to distinguish between constrained chemistry and regulated metabolism do not address the question of how living processes first emerge from simpler constraints on molecular interactions. We describe a molecular model system consisting of coupled reciprocal catalysis and self-assembly in which one of the catalytic bi-products tends to spontaneously self-assemble into a containing shell (analogous to a viral capsule).

    2.1. Living thermodynamics: Living organisms are thermodynamically and biochemically open physicochemical systems. They constantly exchange matter and energy with their physicochemical environment and yet are constrained within physical boundaries and structures that are maintained through dynamic processes. The physicochemical processes that constitute living organisms tend to persist in states maintained far from thermodynamic and chemical equilibrium, whereas non-living
  • Thermodynamics of Surface Phenomena
    THERMODYNAMICS OF SURFACE PHENOMENA. J. C. Melrose, Mobil Research and Development Corporation, P.O. Box 900, Dallas, Texas 75221, U.S.A.

    Abstract: The thermodynamic treatment of the interfacial region corresponding to two fluid phases in contact is discussed. The features of this analysis which are reviewed include the classical treatment of Gibbs and the extensions to this treatment which are due to Buff. It is shown that these extensions are essential if the logical structure of the analysis is to be regarded as complete.

    1. Introduction: The thin, nonhomogeneous region separating two homogeneous bulk phases in contact constitutes an interface. It is generally recognized that an adequate thermodynamic treatment of such a region must be based on the work of Gibbs. Nevertheless, the literature contains a number of proposals for modifying various features of this treatment. It is, in fact, remarkable that no other important contribution of Gibbs to the understanding of the equilibrium states of heterogeneous substances has given rise to so many reservations and attempts to develop alternative treatments. The proposed modifications are usually concerned with one or more of several concepts which are characteristic of the Gibbsian treatment. The first of these concepts involves the notion of a mathematical dividing or reference surface, located within or very near the nonhomogeneous interfacial region. This surface serves to define the geometrical configuration of the interfacial region and also partitions the volume of the system between the two bulk phases. A second feature of the Gibbs treatment which is occasionally challenged is the definition of the chemical potentials which are appropriate to the components present in an interfacial region.
  • UC Berkeley Electronic Theses and Dissertations
    Title: Bounds on the Entropy of a Binary System with Known Mean and Pairwise Constraints. Author: Badr Faisal Albanna. Publication Date: 2013. Permalink: https://escholarship.org/uc/item/1sx6w3qg. A dissertation submitted in partial satisfaction of the requirements for the degree of Doctor of Philosophy in Physics in the Graduate Division of the University of California, Berkeley. Committee in charge: Professor Michael R. DeWeese, Chair; Professor Ahmet Yildiz; Professor David Presti. Fall 2013. Copyright 2013 by Badr Faisal Albanna.

    Abstract: Maximum entropy models are increasingly being used to describe the collective activity of neural populations with measured mean neural activities and pairwise correlations, but the full space of probability distributions consistent with these constraints has not been explored. In this dissertation, I provide lower and upper bounds on the entropy for both the minimum and maximum entropy distributions over binary units with any fixed set of mean values and pairwise correlations, and we construct distributions for several relevant cases. Surprisingly, the minimum entropy solution has entropy scaling logarithmically with system size, unlike the possible linear behavior of the maximum entropy solution, for any set of first- and second-order statistics consistent with arbitrarily large systems.
  • Biological Evolution, Cultural Evolution and the Evolution of Language: Review of Daniel Dennett's From Bacteria to Bach and Back, by Przemysław Żywiczyński
    THEORIA ET HISTORIA SCIENTIARUM, VOL. XVI, Ed. Nicolaus Copernicus University 2019. DOI: http://dx.doi.org/10.12775/ths.2019.010. Przemysław Żywiczyński, Center for Language Evolution Studies, Nicolaus Copernicus University, [email protected].

    Biological Evolution, Cultural Evolution and the Evolution of Language. Review of Daniel Dennett's From Bacteria to Bach and Back.

    Abstract: Daniel Dennett is one of the giants of contemporary philosophy. His new book, From Bacteria to Bach and Back, does reiterate old motifs, such as "strange inversion of reasoning" or "production without comprehension". But it is first and foremost a new project, whose goal is to calibrate the theory of universal Darwinism to the very recent developments in science, technology and our lifestyles, the most important of which is the coming of Artificial Intelligence. What Dennett does in the new book is offer us "thinking tools" (his own phrase) to understand this changing reality by means of basic Darwinian principles.

    Keywords: universal Darwinism; AI; cognitive science; Darwinian Spaces; biological evolution; cultural evolution; evolution of language; memetics.

    Introduction: Daniel Dennett is the type of writer who does not produce works of minor importance. All his books, starting with the collection of essays Brainstorms (1978), grapple with the grandest philosophical problems: mind and free will, morality, and the nature and scope of evolutionary processes. However, some of his titles stand out, not only due to the breath-taking range of subjects they cover and the subtlety of exposition these subjects are given, but also due to the impact they have exerted on contemporary philosophical debate.
  • INCOMPLETE NATURE Open Call - Members’ Show, Interface
    14 – 24 September. THEME: The title Incomplete Nature is borrowed from the name of Terrence W. Deacon's book about how life and the mind emerged from inanimate matter, in which he attempts to approach values, purpose and meaning from a scientific perspective. The "Theory of Everything" that emerges from scientific investigations appears to include everything but the feelings, meanings, consciousness, and purposes that make us (and many other animals) what we are. These phenomena are left unexplained by the natural sciences because they lack the physical properties—such as mass, momentum, charge, and location—that are assumed to be necessary for something to have physical consequences in the world.

    Robert Logan, in his review of Deacon's book, writes: There is little doubt that the paradigm of reductive science does not and cannot explain the phenomena of life, sentience, mind, purpose, meaning and value. We have learned much about the operations of the physical brain, its neurons, its neural networks, its chemistry, and its bicameralism, and yet we cannot connect these understandings with human behaviour, human will and human spirituality. Part of the new paradigm that Deacon is developing is the notion that biology, in addition to being a physical and chemical science, is also a semiotic science in which meaning plays an essential role in understanding living systems.1 Deacon argues that many phenomena with no clear physical explanation can still have causal influence in the world. "To
  • Towards Quantifying a Wider Reality: Shannon Exonerata
    Information 2011, 2, 624-634; doi:10.3390/info2040624 OPEN ACCESS information ISSN 2078-2489 www.mdpi.com/journal/information Essay Towards Quantifying a Wider Reality: Shannon Exonerata Robert E. Ulanowicz 1,2 1 Department of Biology, University of Florida, Gainesville, FL 32611-8525, USA; E-Mail: [email protected] 2 University of Maryland Center for Environmental Science, Solomons, MD 20688-0038, USA Received: 14 July 2011; in revised form: 14 September 2011 / Accepted: 26 September 2011 / Published: 25 October 2011 Abstract: In 1872 Ludwig von Boltzmann derived a statistical formula to represent the entropy (an apophasis) of a highly simplistic system. In 1948 Claude Shannon independently formulated the same expression to capture the positivist essence of information. Such contradictory thrusts engendered decades of ambiguity concerning exactly what is conveyed by the expression. Resolution of widespread confusion is possible by invoking the third law of thermodynamics, which requires that entropy be treated in a relativistic fashion. Doing so parses the Boltzmann expression into separate terms that segregate apophatic entropy from positivist information. Possibly more importantly, the decomposition itself portrays a dialectic-like agonism between constraint and disorder that may provide a more appropriate description of the behavior of living systems than is possible using conventional dynamics. By quantifying the apophatic side of evolution, the Shannon approach to information achieves what no other treatment of the subject affords: It opens the window on a more encompassing perception of reality. Keywords: apophasis; constraint; entropy; flexibility; hegelian dialectic; information; meaning; positivism; probability; sustainability 1. A World with Absences The most important thing about information theory is not information.
  • The Math Is Not the Territory: Navigating the Free Energy Principle
    The Math is not the Territory: Navigating the Free Energy Principle. Mel Andrews, October 8th, 2020.

    Contents: 1 Abstract; 2 Introduction; 3 The Free Energy Principle (3.1 History of the Formalism; 3.1.1 The Epistemic Turn in Statistical Mechanics; 3.1.2 The Mean Field Approximation; 3.1.3 Free Energy in Machine Learning; 3.1.4 Variational Bayes; 3.1.5 Innovations in Friston's Free Energy Minimisation; 3.2 Fundamentals of the FEP); 4 Markov Blankets, Free Energy, & Generative Models (4.1 Markov Blankets; 4.2 Free Energy, Entropy; 4.3 Generative Models; 4.4 Recapitulation); 5 Reinterpreting the FEP; 6 Models (6.1 The FEP as Scientific Model; 6.2 Normative & Process Models; 6.3 Origins of the Distinction in Mathematical Psychology; 6.4 Models in Philosophy of Science; 6.5 Exploratory Models; 6.6 Modelling with and without Specific Targets; 6.7 Targetless Models; 6.8 Models & Simulations; 6.9 Generic & Conceptual Models; 6.10 Generic Models; 6.11 Conceptual Models; 6.12 Alternative Epistemic Virtues; 6.13 Guides to Discovery; 6.14 Takeaways from the Modelling Literature); 7 Conclusion; 8 Acknowledgements; 9 References.

    Abstract: The free energy principle (FEP) has seen extensive philosophical engagement, both from a general philosophy of science perspective and from the perspective of philosophies of specific sciences: cognitive science, neuroscience, and biology. The literature on the FEP has attempted to draw out specific philosophical commitments and entailments of the framework.
  • Special Systems Theory
    Special Systems Theory Kent D. Palmer http://kdp.me [email protected] 714-633-9508 Copyright 2013 Kent Palmer; Draft 0.7, 2013.4.8, reworked 2013.6.21 All rights reserved. Not for distribution CAS20130621paperkdp07a.docx http://kentpalmer.name Abstract: A new advanced systems theory concerning the emergent nature of the Social, Consciousness, and Life based on Mathematics and Physical Analogies is presented. This meta- theory concerns the distance between the emergent levels of these phenomena and their ultra- efficacious nature. The theory is based on the distinction between Systems and Meta-systems (organized Openscape environments). We first realize that we can understand the difference between the System and the Meta-system in terms of the relationship between a ‘Whole greater than the sum of the parts’ and a ‘Whole less than the sum of its parts’, i.e., a whole full of holes (like a sponge) that provide niches for systems in the environment. Once we understand this distinction and clarify the nature of the unusual organization of the Meta-system, then it is possible to understand that there is a third possibility which is a whole exactly equal to the sum of its parts that is only supervenient like perfect numbers. In fact, there are three kinds of Special System corresponding to the perfect, amicable, and sociable aliquot numbers. These are all equal to the sum of their parts but with different degrees of differing and deferring in what Jacques Derrida calls “differance”. All other numbers are either excessive (systemic) or deficient (metasystemic) in this regard. The Special Systems are based on various mathematical analogies and some physical analogies.
  • Eighteenth Annual Biosemiotics Gathering Abstract Booklet
    Eighteenth Annual Biosemiotics Gathering Abstract Booklet University of California, Berkeley June 17-20, 2018 Organized by Terrence Deacon and Yogi Hendlin and the International Society for Biosemiotic Ethics www.biosemiotics.life Victoria Alexander | [email protected] Dactyl Foundation New York Council for the Humanities “Eating and incorporation, from symbiogenesis to society” When asked for this panel to consider food from a biosemiotic perspective, I thought first of the fact that I, as a farmer, have eaten animals that I had raised, named, and communicated with before finally leading them away to slaughter. I begin, therefore, noting the irony of my claiming to be a humanitarian while also being somewhat red in tooth and claw. But that’s enough about my incisors. For this talk I would like to reflect on the process of eating as just one of the ways, from symbiogenesis to society, in which the other is incorporated into an ever-widening body. I consider the dehumanization, defined as the reduction of semiotic freedom, of the individual within a larger society. I think about how the individual always loses its creative potential when it becomes part of a whole. In the process, the individual gains other things too, of course, in particular, usefully constraining contexts, without which a self could not be differentiated. But there needs to be balance. The less semiotic freedom an individual human has, the easier it is for political/religious leaders to sacrifice him to the whole. I have spent a dozen years now trying to work out biosemiotic theory as it applies to science, philosophy and aesthetics.