KRNC: New Foundations for Permissionless Byzantine Consensus and Global Monetary Stability


Clinton Ehrlich¹ and Anna Guzova²

February 27, 2020
Version 1.5

"It is not unlikely that a new security/trust protocol, with the magnitude of the influence from the invention of the public-key protocol, [will be] inspired by the study of animal communication networks."
– Prof. Zhanshan Ma, Chinese Academy of Sciences, 2009

Abstract

This paper applies biomimetic engineering to the problem of permissionless Byzantine consensus and achieves results that surpass the prior state of the art by four orders of magnitude. It introduces a biologically inspired asymmetric Sybil-resistance mechanism, Proof-of-Balance,³ which can replace symmetric Proof-of-Work and Proof-of-Stake weighting schemes. The biomimetic mechanism is incorporated into a permissionless blockchain protocol, Key Retroactivity Network Consensus ("KRNC"), which delivers ~40,000 times the security and speed of today's decentralized ledgers. KRNC allows the fiat money that the public already owns to be upgraded with cryptographic inflation protection, eliminating the problems inherent in bootstrapping new currencies like Bitcoin and Ethereum. The paper includes two independently significant contributions to the literature. First, it replaces the non-structural axioms invoked in prior work with a new formal method for reasoning about trust, liveness, and safety from first principles. Second, it formalizes two simple but powerful exploits — book-prize attacks and pseudo-transfer attacks — that undermine the security guarantees of all prior permissionless ledgers.

¹ Chief Computer Scientist, Krnc Inc.; Fmr. Visiting Researcher, MGIMO University, [email protected]
² Lead Mathematician, Krnc Inc.; Fmr. Senior Applied Mathematician, AO UniCredit Bank
³ Proof-of-Balance comprises the subject matter of the following published patent applications: US 16/261,478; PCT/US2019/015732.

Part I: Concept

1. Introduction

1.1 Summary

Reverse engineering biological systems has yielded rapid advancement in fields ranging from pharmacology and artificial intelligence to materials science and aerospace design. [1] This approach, known as biomimicry, allows humans to learn from and copy what are effectively "alien technologies" developed through billions of years of evolutionary optimization.

It has long been hoped that biomimicry could be the key to a major leap forward in trust-minimized computation. In 2009, the same year that Bitcoin was released, one of the world's few dual PhDs in computer science and biology predicted that adapting animal-communication techniques to fault-tolerant distributed systems could yield a breakthrough comparable to the invention of asymmetric encryption in the 1970s. [2] This paper vindicates that prediction: it adapts cue-authenticated biological signaling to construct the first asymmetric method of Sybil-resistance, which allows correct agents to verifiably retain control of a permissionless blockchain even if they are unable to match the adversary's budget for an attack. Adding biomimetic cost asymmetry to off-the-shelf consensus algorithms unlocks a roughly 40,000-fold increase in reliability, speed, and scalability over symmetric weighting methods, such as Proof-of-Work and Proof-of-Stake, which require correct agents to expend more resources than their faulty counterparts.
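To make the contrast between symmetric and asymmetric weighting concrete, the two security conditions can be stated schematically. The notation below is ours, introduced purely for illustration and not taken from the paper: let $C_h$ and $C_a$ denote the resources that correct agents and the adversary can verifiably commit, and let $\tau$ be the fault threshold of the underlying consensus algorithm (e.g., $\tau = 1/3$). Symmetric weighting is safe only if the faulty fraction of committed weight stays below the threshold:

\[
\frac{C_a}{C_a + C_h} < \tau \quad\Longleftrightarrow\quad C_h > \frac{1-\tau}{\tau}\,C_a ,
\]

so for $\tau = 1/3$ correct agents must out-commit the adversary two to one. Under asymmetric, cue-authenticated weighting, the cost of an honest signal can approach zero while a dishonest signal remains verifiably expensive, so safety no longer requires $C_h$ to exceed any multiple of $C_a$; it requires only that the adversary's budget fall short of the cost of forging the cue.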
Those legacy technologies embody the "handicap principle," a theory that originated in biology to explain the evolution of seemingly wasteful traits, like the oversized tails of peacocks. According to the handicap principle, the reliability of a signal depends on its verifiable cost to the signaler. [3] Inside the Bitcoin community, this theory has been elevated to the status of a supposedly universal natural law, which applies with equal weight to biology and computer science. [4]

In Proof-of-Work, a handicap is imposed by forcing consensus participants to expend computing power. In Proof-of-Stake, a handicap is imposed by forcing participants to expend money. In both cases, the goal is to authenticate the results of consensus by auctioning control of the blockchain to the highest bidder. If voting power is assigned in proportion to verifiable handicaps, then a virtual network can be created on which the fraction of faulty replicas is guaranteed to fall below the security threshold of a specified consensus algorithm. This approach predates Bitcoin by four years; it has been the foundation of permissionless Byzantine consensus since the publication of the first Sybil-resistant algorithms. [5] Unfortunately, it is deeply flawed. The present paper identifies three critical problems.

A Confluence of Errors

First, the traditional handicap principle is no longer good science. It reflects the state of biological signaling theory in the early 1990s. [6] Subsequent research, some of it by the same scientist who first formalized the handicap principle, has refuted the theory that a signal's reliability depends on its verifiable cost to the signaler. In reality, honest signals can be transmitted at zero cost, as long as the verifiable cost of a dishonest signal is sufficiently high. [7] There is thus no intrinsic reason that participants in permissionless consensus should be forced to waste money or computing power. The costs that Proof-of-Work and Proof-of-Stake systems impose on their users are a design flaw, not a feature.

Second, the assumption that Sybil-resistance is sufficient to guarantee consensus on a permissionless network is false. It conflates two distinct forms of statistical bias: sampling error and non-sampling error. The former relates to which elements of a population are included in a sample, the latter to how those elements are counted. [8] Sybil-resistance guarantees that the entities participating in consensus will be counted correctly, but it does not guarantee that those entities are an unbiased sample of the population that is axiomatically known to contain an honest majority or supermajority. The maximum fraction of corrupted entities within the protocol can be verified only if the set of protocol participants is large enough to ensure an accurate sample of the population whose minimum percentage of honest members is axiomatically known. Today's Sybil-resistant protocols do not satisfy the minimum threshold for statistical reliability, so they are vulnerable to "book prize" attacks, which exploit their reliance on non-probability sampling.

Third, axioms about control of a designated resource are incompatible with the concept of an adaptive adversary. Economic agents have varying resource endowments, so an adaptive adversary can alter how much of a given resource it controls simply by switching which agents it has corrupted.
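The force of this third problem is easy to see in a toy simulation. Everything below (the agent endowments, the corruption budget, and the greedy targeting strategy) is our own illustrative construction, not part of KRNC: it merely shows that, for a fixed budget, the fraction of a designated resource under the adversary's control depends entirely on which agents it chooses to corrupt.

# Illustrative sketch, not from the paper: an adaptive adversary with a fixed
# value budget chooses which agents to corrupt, so the fraction of a designated
# resource (here, stake) under its control is not a constant of the model.

from dataclasses import dataclass

@dataclass
class Agent:
    cost_to_corrupt: float  # value the adversary must spend to corrupt this agent
    stake: float            # units of the designated consensus resource it holds

def controlled_stake_fraction(agents: list[Agent], budget: float) -> float:
    """Greedy heuristic: corrupt the agents offering the most stake per unit of
    corruption cost until the budget runs out; return the stake fraction won."""
    total = sum(a.stake for a in agents)
    controlled = 0.0
    for a in sorted(agents, key=lambda a: a.stake / a.cost_to_corrupt, reverse=True):
        if a.cost_to_corrupt <= budget:
            budget -= a.cost_to_corrupt
            controlled += a.stake
    return controlled / total

# Ten agents with equal stake but unequal corruption costs. With a budget of 5,
# an adversary that targets the five cheap agents controls 50% of the stake,
# while one that targets only expensive agents controls none.
population = ([Agent(cost_to_corrupt=1.0, stake=10.0) for _ in range(5)]
              + [Agent(cost_to_corrupt=10.0, stake=10.0) for _ in range(5)])
print(controlled_stake_fraction(population, budget=5.0))  # prints 0.5

Because the controlled fraction can swing from 0% to 50% under the same budget, an axiom of the form "the adversary holds less than one third of the stake" is not stable under adaptive corruption; only the budget itself is invariant, which motivates the axiomatic requirement developed next.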
To establish the security of a resource-weighted protocol, it is therefore necessary to start with an axiom that is invariant in the face of adaptive corruption — such as the maximum combined value of all resources within the adversary's potential control — and to prove from that axiom that the adversary will be unable to acquire the fraction of the designated resource needed for an attack. Existing proofs of security for permissionless blockchains are tautological, because they start with this desired conclusion as their premise.

These three problems are related. Because computer scientists have mistakenly assumed that it is necessary to employ handicap-authenticated signaling, they have designed protocols that force all participants to waste money or computing power. Because today's protocols force all participants to waste money or computing power, most internet users decline to join, so the set of participating agents is not large enough to ensure a statistically unbiased sample. Because the set of protocol participants is not a reliable sample of the population, it is impossible to prove the existence of an honest majority within the protocol, so one has simply been assumed as an axiom.

It should be a red flag when the most rigorous proofs in a field all start with the same convenient-but-unreliable axiom. [9] The ultimate purpose of formal proofs of security is to provide information to end users about the real-world reliability of a given protocol. If the adversary model employed in a formal proof does not match the threats that are present in the real world, "the presented protocols may not be fully proven by the formalization." [10] A facially rigorous proof based on a flawed adversary model may be worse than no proof at all, because the illusion of security it provides can induce end users to leave themselves vulnerable to attack. If society is going to employ permissionless ledgers to manage billions of dollars in value, the security of those systems should be derived from axioms that are known to be true with overwhelming probability. If proof of security cannot be obtained from reliable premises, then the public should be warned to treat permissionless ledgers as high-risk toys, not serious financial platforms for storing and exchanging value.

Illusion of Security

This pessimism may sound inconsistent with the track record of today's permissionless blockchains. It is not. Past results provide no credible assurances of future safety in this context, because — according to game-theoretic modeling — a rational adversary will delay its attack to inculcate a false sense of security among protocol participants, then "cash out" by executing a double-spending attack once a sufficiently large payoff is available. As Ponzi schemes famously illustrate, a long record of apparently honest operation is entirely consistent with a planned collapse.
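The game-theoretic claim can be captured in a one-line model of our own devising, offered only as a sketch: suppose an attack at time $t$ yields a double-spend payoff $P(t)$, which grows with the value entrusted to the ledger, at a fixed cost $c$, discounted by a factor $\delta \in (0,1]$. A rational adversary chooses the attack date

\[
t^{*} = \arg\max_{t \ge 0}\; \delta^{t}\bigl(P(t) - c\bigr).
\]

Whenever $P(t)$ grows quickly enough to outpace discounting, the maximizer $t^{*}$ lies in the future, so the adversary attacks late by construction. An unblemished operating history up to any time $t < t^{*}$ is therefore exactly what the model predicts before a cash-out, and it carries no evidential weight about the protocol's safety.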