Algebraic Information Theory


Marc Pouly ([email protected])
Apsia Breakfast Seminar
Interdisciplinary Centre for Security, Reliability and Trust
University of Luxembourg, June 2011

Hartley's Measure (1928)

Given a set S = {s1, ..., sn}, how can we measure its uncertainty u(S)?

1. Uncertainty is a non-negative value.
2. Monotone: |S1| ≤ |S2| implies u(S1) ≤ u(S2).
3. Additive: u(S1 × S2) = u(S1) + u(S2).

Theorem: There is only one function that satisfies these properties:

    u(S) = log |S|

From Uncertainty to Information

The uncertainty of S = {s1, ..., s8} is log 8 = 3 bits. Assume now that someone gives more precise information, for example that either s3 or s7 has been transmitted. We are left with S' = {s3, s7} and a remaining uncertainty of log 2 = 1 bit. The information reduces uncertainty by log 8 − log 2 = 2 bits.

Information is the reduction of uncertainty!

Shannon's Measure (1948)

How much uncertainty is contained in a set S = {s1, ..., sn} if the probability pi = p(si) of each element is known?
    S(p1, ..., pn) = − ∑ pi log pi   (sum over i = 1, ..., n)

We have a similar uniqueness result for specific properties, and an information theory is derived by the same principle. This is what people call classical or statistical information theory. Shannon generalizes Hartley: S(1/n, ..., 1/n) = log n.

How Do We Represent Information?

Thanks to Hartley we have an information theory for sets, and thanks to Shannon an information theory for probabilities. But there are other ways of representing information (on computers): databases (relations), systems of equations and inequalities, constraint systems, possibilistic formalisms, formalisms for imprecise probabilities, Spohn potentials, graphs, logics, ...

Statistical information theory is not enough!

Extending Hartley's Theory

Alphabets give rise to Hartley's information theory, probabilistic sources to Shannon's information theory, and isomorphisms to relational information theory.

The Fundamental Theorem of Lattice Theory

Hartley's information theory assumes a finite alphabet S and assigns values to subsets, i.e. u : P(S) → R≥0.
P(S) is a distributive lattice with meet ∩ and join ∪.

Theorem (Fundamental Theorem of Lattice Theory): Every distributive lattice is isomorphic to a lattice of subsets.

We can therefore carry over Hartley's measure to isomorphic formalisms, for example to the relational algebra used in databases.

Relational Information Theory

We can thus measure the uncertainty in relations, such as

    R:  Destination  Departure  Gate
        Heathrow     10:00      7
        Heathrow     14:00      9
        Gatwick      08:30      4
        City         11:15      5
        City         15:20      7

and obtain u(R) = log 5 bits. If we agree on the three properties stated by Hartley, then u(S) = log |S| is the only correct way of measuring uncertainty in subset systems, and hence also the right way for isomorphic formalisms such as the relational algebra.

Duality of Information

Consider the relation R above and two different questions:

1. How do I get to London? The more tuples, the more information.
2. I am waiting for my friend; which flight might she have taken? The fewer tuples, the more information.

Such a dualism is always present in order theory, but not for measures. Is measuring information sometimes too restrictive?
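The Hartley quantities above can be checked numerically. A minimal sketch in Python (the `hartley` helper name is mine; the set sizes and the flight relation are taken from the slides):

```python
import math
from itertools import product

def hartley(s):
    """Hartley uncertainty of a finite set: u(S) = log2 |S|, in bits."""
    return math.log2(len(s))

S = {f"s{i}" for i in range(1, 9)}           # S = {s1, ..., s8}
print(hartley(S))                            # 3.0 bits

# Information is the reduction of uncertainty:
S_prime = {"s3", "s7"}
print(hartley(S) - hartley(S_prime))         # 2.0 bits

# Additivity: u(S1 x S2) = u(S1) + u(S2)
S1, S2 = {"a", "b"}, {"x", "y", "z"}
assert math.isclose(hartley(set(product(S1, S2))),
                    hartley(S1) + hartley(S2))

# The flight relation R from the slides has 5 tuples:
R = {("Heathrow", "10:00", 7), ("Heathrow", "14:00", 9),
     ("Gatwick", "08:30", 4), ("City", "11:15", 5), ("City", "15:20", 7)}
print(hartley(R))                            # log2 5, about 2.32 bits
```

The same function applies unchanged to sets and to relations, which is exactly the point of carrying Hartley's measure over to isomorphic formalisms.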
Linear Equation Systems

The solution set of a linear equation system forms an affine space:

    X1 − 2X2 + 2X3 = −1
    3X1 + 5X2 − 3X3 = 8
    4X1 + 3X2 −  X3 = 7

The null space of the equation matrix A is

    N(A) = {(x1, x2, x3) ∈ R³ : x1 = −(4/11) x3, x2 = (9/11) x3}

How much uncertainty is contained in an equation system? Can we treat this just as another subset system?

Equational Information Theory

Linear equation systems can have no, one, or infinitely many solutions. Hence, the uncertainty is either log 0, log 1 or log ∞. Here, a (quantitative) measure of information is not appropriate.

A First Summary

A theory of information should explain what information is. For Hartley and Shannon, information is the reduction of uncertainty, which relies on the assumption that information can be measured. But there are many formalisms for representing information on computers that are not covered by this theory. Does this theory reflect our daily perception of information? What is our perception of information?

What is Information?

... information exists in pieces
... information comes from different sources
... information refers to questions
... pieces of information can be combined
... information can be focussed on the questions of interest

Towards a Formal Framework

- Information exists in pieces φ, ψ ∈ Φ.
- There is a universe of questions r, and every piece of information φ ∈ Φ refers to a finite set of questions d(φ) ⊆ r.
- Combination of information: φ ⊗ ψ.
- Focussing of information: if d(φ) = x and y ⊆ x, then φ↓y ∈ Φ.

... and the Same Again for Nerds ...

This is a two-sorted algebra (Φ, r) with a universe of questions r and information pieces Φ:

- labeling operator d : Φ → P(r)
- combination operator ⊗ : Φ × Φ → Φ
- focussing operator ↓ : Φ × P(r) → Φ

But operations cannot be arbitrary - they must satisfy some rules!

Axioms of Information

1. It should not matter in which order information is combined:
   φ ⊗ ψ = ψ ⊗ φ and (φ ⊗ ψ) ⊗ ν = φ ⊗ (ψ ⊗ ν)
2. A combination refers to the union of the question sets:
   d(φ ⊗ ψ) = d(φ) ∪ d(ψ)
3. Focussing information on x ⊆ d(φ) gives information about x:
   d(φ↓x) = x
4. Focussing can be done step-wise:
   If x ⊆ y ⊆ d(φ), then φ↓x = (φ↓y)↓x.
5. Combining a piece of information with a part of itself gives nothing new:
   φ ⊗ φ↓x = φ

The Combination Axiom

How shall ⊗ and ↓ behave with respect to each other?

6. If φ, ψ ∈ Φ with d(φ) = x and d(ψ) = y, then (φ ⊗ ψ)↓x = φ ⊗ ψ↓(x ∩ y).

Compare with the distributive law: (a × b) + (a × c) = a × (b + c).

Definition (Kohlas, 2003): A system (Φ, r) satisfying the six axioms is called an information algebra.

Relational Databases

Relations are pieces of information:

    Player      Club   Goals          Player      Nationality
    Ronaldinho
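The relational instance can be sketched concretely: relations as lists of tuples, with combination as natural join and focussing as projection. The helper names `label`, `combine` and `focus` and the tuple values are invented for illustration; only the attribute names come from the slide:

```python
def label(rel):
    """d(phi): the set of attributes (questions) a relation refers to."""
    return frozenset(next(iter(rel)).keys()) if rel else frozenset()

def combine(phi, psi):
    """phi (x) psi: natural join - merge tuples that agree on shared attributes."""
    return [{**s, **t} for s in phi for t in psi
            if all(s[a] == t[a] for a in label(phi) & label(psi))]

def focus(phi, x):
    """phi|x: project a relation onto the attribute set x, dropping duplicates."""
    seen = []
    for t in phi:
        proj = {a: t[a] for a in x}
        if proj not in seen:
            seen.append(proj)
    return seen

phi = [{"Player": "Ronaldinho", "Club": "Barcelona", "Goals": 21}]
psi = [{"Player": "Ronaldinho", "Nationality": "Brazilian"}]

# Axiom 2: a combination refers to the union of the question sets.
assert label(combine(phi, psi)) == label(phi) | label(psi)

# Combination axiom (6): focussing a join onto x only needs psi's overlap with x.
x, y = label(phi), label(psi)
assert combine(phi, focus(psi, x & y)) == focus(combine(phi, psi), x)
```

The second assertion is the combination axiom in action: instead of joining the full relations and then projecting, one may project ψ onto the shared attributes first, which is the algebraic basis for local computation in databases.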