Industrial and Systems Engineering
ISE Technical Report 21T-004

What is information after all? How the founder of modern dialectical logic could help the founder of cybernetics answer this question

E. Perevalov
Department of Industrial and Systems Engineering
Lehigh University, Bethlehem, PA 18015, USA

Abstract

N. Wiener’s negative definition of information is well known: it states what information is not. According to this definition, it is neither matter nor energy. But what is it? It is shown how one can follow the lead of dialectical logic as expounded by G.W.F. Hegel in his main work [1] – “The Science of Logic” – to answer this and some related questions.

Contents

1 Introduction
  1.1 Why Hegel, and is the author an amateur Hegelian?
  1.2 Conventions and organization
2 A brief overview of (Section I of) Hegel’s Doctrine of Essence
  2.1 Shine and reflection
  2.2 Essentialities
  2.3 Essence as ground
3 Matter and Energy
  3.1 Energy: quantitative characterization
  3.2 Matter revisited
4 Information
  4.1 Information quality and quantity: syntactic information
    4.1.1 Information contradictions and their resolution: its ideal universal form, aka probability distribution
    4.1.2 Form and matter II: abstract information, its universal form and quantity; Kolmogorov complexity as an expression of the latter
    4.1.3 Form and content II: abstract quantity of specific information, aka Kullback-Leibler divergence
    4.1.4 Abstract information quantity and inference: Maximum Entropy
    4.1.5 Information Theory
  4.2 Information essence, appearance and actuality. Semantic information
    4.2.1 Information essence
    4.2.2 Information appearance and actuality
    4.2.3 Semantic information and its quantity
  4.3 Objectivity of information and its forms
    4.3.1 The ideal vs the material, or whose information is it?
    4.3.2 Maximum entropy revisited
  4.4 Information and knowledge
  4.5 Information paradoxes
  4.6 Philosophy of Information: where do we stand on its foundational questions?
5 Summary: space, time, matter, mass, energy, and information
A “Logical atomism” vs. rational dialectics: sublation or mere truncation?
  A.1 B. Russell’s philosophical genesis
  A.2 The subject matter of logic
  A.3 Logical atomism: the original version
  A.4 L. Wittgenstein: logical atomism as a system
  A.5 Summary: the paradox of anti-philosophical philosophy
B 20th century theoretical physics between dialectics and metaphysics
  B.1 A famous example
  B.2 A brief logical aside on space and time
  B.3 Relativity revisited
  B.4 Mathematical neoplatonism as an easy way out of the objective-subjective contradiction
  B.5 Logico-philosophical void and “crazy theories” as its fillers
  B.6 Physicists and philosophy
  B.7 A cursory look at the technical progress
Bibliography

1 Introduction

It can be considered by now indisputable that the general notion of information is central to pretty much all of modern science and engineering. Somewhat surprisingly, however, a proper concept of information is still not available. In other words, information remains somewhat of a mystery as far as its true nature is concerned. Most people – professional scientists and philosophers included – can recognize information when they see it, but would generally find themselves at a loss if asked to explain what it really is. Academic philosophy has recognized this situation by spawning from within itself a growing sub-area known as the philosophy of information (PI) – see, for example, [2] for an introduction. One of the main founders of the new field, L.
Floridi, enumerated eighteen open problems, such that progress in solving any one of them would advance the field. The first of these problems, quite naturally, is simply: what is information? L. Floridi characterizes this particular problem in the following words [3]:

    This is the hardest and most central question in PI. Information is still an elusive concept. This is a scandal not in itself but because so much basic theoretical work relies on a clear analysis and explanation of information and of its cognate concepts. We know that information ought to be quantifiable, additive, storable and transmittable. But apart from this, we still do not seem to have a much clearer idea about its specific nature.

This is indeed a very fair assessment of the current state of affairs, which – as noted in the quotation just given – has somewhat of a scandalous air about it. One would be justified in stating that making some amends in this regard would be rather desirable. Accordingly, the main goal of this article is to use the logical machinery developed by G.W.F. Hegel (as expounded primarily in his main work [1]) to make progress in this particular direction. During Hegel’s time, the term “information” was not yet in wide use (for example, it does not appear a single time in “The Science of Logic”), and Hegel himself had little to say about the corresponding concept. Our intention is to employ the methodology of “The Science of Logic” to process the knowledge available now and arrive at a rational understanding of information’s true nature.

If one goes back to the more recent history of science in search of fundamental progress in the understanding of various aspects of information, the names of C. Shannon and N. Wiener, the founding fathers of modern Information Theory and Cybernetics, respectively, immediately come to mind. C.
Shannon apparently held a rather pessimistic view on the existence of a general concept of information ([4], p. 180):

    The word ‘information’ has been given different meanings by different writers in the general field of information theory. It is likely that at least a number of these will prove sufficiently useful in certain applications to deserve further study and permanent recognition. It is hardly to be expected that a single concept of information would satisfactorily account for the numerous possible applications of this general field.

It is interesting to note that C. Shannon himself was apparently somewhat dissatisfied with the name – Information Theory – under which his theory became known. The reason was that both he and his prominent collaborator W. Weaver believed that what had been developed was better described as a Mathematical Theory of Communication [5], and that the term “Information Theory” was unjustifiably broad.

N. Wiener is credited with the following rather famous sentence [6]:

    Information is information, not matter or energy.

This sentence can be considered a negative definition of information. In our view, it is fully correct, and it will thus be made the starting point of our discussion. In this quotation, N. Wiener acknowledges that he does not know what information is, but states what information is not: it is neither matter nor energy. Moreover, the real beauty of Wiener’s negative definition lies just a bit below its surface. Specifically, it implies that the concept of information is likely as fundamental as those of matter and energy, and it thus provides a hint for those looking for clarity in these fundamental matters: to understand the true nature of information – taken in its most general sense – one would be well served by starting from the natures of matter and energy. Our task here will be relatively straightforward precisely due to the previous contributions of two great thinkers: G.W.F. Hegel and N. Wiener.
We will simply take the direction provided by the latter and move along it using the tools developed by the former, the level of our own effort being rather modest as a result.

1.1 Why Hegel, and is the author an amateur Hegelian?

It will become clear very soon that, methodologically speaking, very extensive use is made of Hegel’s logical system as expounded in his most important work, “The Science of Logic” [1]. An obvious question any inquisitive reader is likely to ask is what makes Hegel’s philosophy (since it is well known that he was exclusively a philosopher) in general, and “The Science of Logic” in particular, so special as to be chosen as the main methodological foundation for solving puzzles of information, especially since that book was written about two centuries ago. To explain such a seemingly exotic methodological choice, a small digression into the micro-history (not in the sense of being extremely detailed, but in that of being very insignificant) of the present article is in order.

The author has to confess right away to not being a professional philosopher, but rather an industrial engineer with a specific interest in the information/probability aspect of the field, and with some prior background as a physicist. The original motivation that eventually gave rise to the present article was the wish to develop a general method – a “mathematical framework”, in the language preferred at the time that motivation arose – for the optimal extraction of additional information in optimization and decision making under uncertainty. With respect to Shannon’s existing Information Theory, the method to be developed was envisioned as a complement and extension: while Information Theory provides a general method for the optimal transmission of information, the method to be developed would address the “end links” of the “information chain” – the chain that begins with the obtaining of information and ends with its use for solving a problem.
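As a concrete touchstone for the “quantifiable, additive” character of information mentioned in Floridi’s list of desiderata, the following minimal Python sketch (our own illustration, not part of the original argument) computes Shannon’s entropy for two independent sources and checks its additivity, the property that the joint entropy of independent sources is the sum of their individual entropies:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a discrete distribution given as probabilities."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Two independent sources.
px = [0.5, 0.5]                 # a fair coin: 1 bit of uncertainty
py = [0.25, 0.25, 0.25, 0.25]   # a fair four-sided die: 2 bits

# Joint distribution of independent sources: the outer product of the marginals.
pxy = [a * b for a in px for b in py]

print(entropy(px))   # 1.0
print(entropy(py))   # 2.0
print(entropy(pxy))  # 3.0 -- additivity: H(X, Y) = H(X) + H(Y) for independent X, Y
```

This additivity is one of the few properties of information that, as the Floridi quotation notes, is generally agreed upon even in the absence of a settled concept of what information is.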