Tuple Routing Strategies for Distributed Eddies

Feng Tian, David J. DeWitt
Department of Computer Sciences, University of Wisconsin, Madison
Madison, WI 53706
{ftian, dewitt}@cs.wisc.edu

Proceedings of the 29th VLDB Conference, Berlin, Germany, 2003

Abstract

Many applications that consist of streams of data are inherently distributed. Since input stream rates and other system parameters, such as the amount of available computing resources, can fluctuate significantly, a stream query plan must be able to adapt to these changes. Routing tuples between operators of a distributed stream query plan is used in several data stream management systems as an adaptive query optimization technique. The routing policy used can have a significant impact on system performance. In this paper, we use a queuing network to model a distributed stream query plan and define performance metrics for response time and system throughput. We also propose and evaluate several practical routing policies for a distributed stream management system. The performance results of these policies are compared using a discrete event simulator. Finally, we study the impact of the routing policy on system throughput and resource allocation when computing resources can be shared between operators.

1. Introduction

Stream database systems are a new type of database system designed to facilitate the execution of queries against continuous streams of data. Example applications for such systems include sensor networks, network monitoring applications, and online information tracking. Since many stream-based applications are inherently distributed, a centralized solution is not viable. Recently, the design and implementation of scalable, distributed data stream management systems has begun to receive the attention of the database community.

Many of the fundamental assumptions that are the basis of standard database systems no longer hold for stream management systems [8]. A typical stream query is long running -- it listens on several continuous streams and produces a continuous stream as its result. The notion of running time, which is used as an optimization goal by a classic database optimizer, cannot be directly applied to a stream management system; a data stream management system must use other performance metrics. In addition, since the input stream rates and the available computing resources will usually fluctuate over time, an execution plan that works well at query installation time might be very inefficient just a short time later. Furthermore, the “optimize-then-execute” paradigm of traditional database systems is no longer appropriate, and a stream execution plan must be able to adapt to changes in input streams and system resources.

An eddy [2] is a stream query execution mechanism that can continuously reorder operators in a query plan. Each input tuple to an eddy carries its own execution history, implemented using two bitmaps: a done bitmap records which operators the tuple has already visited, and a ready bitmap records which operators the tuple can visit next. The eddy routes each tuple to the next operator based on the tuple’s execution history and statistics maintained by the eddy. If the tuple satisfies the predicate of an operator, the operator makes the appropriate updates to the two bitmaps and returns the tuple to the eddy. The eddy continues this iteration until the tuple has visited all operators. Figure 1.1 shows an eddy with three operators. The major advantage of an eddy is that the execution plan is highly adaptive, with the routing decision for each individual tuple deciding the execution order of the operators for that tuple. [2][18] demonstrate that this technique adapts well to changes in input stream rates.
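To make the per-tuple iteration concrete, the following is a minimal sketch of a centralized eddy loop under the done/ready bookkeeping described above. The Operator interface, pick_next policy, and class names are assumptions for illustration, not the actual Telegraph/eddy code.

```python
# Minimal sketch of the centralized per-tuple routing loop described above.
# Operator.apply() and the pick_next policy are assumed interfaces.

class EddyTuple:
    def __init__(self, values, num_ops):
        self.values = values
        self.done = [False] * num_ops   # operators this tuple has already visited
        self.ready = [True] * num_ops   # operators this tuple may visit next

class Eddy:
    def __init__(self, operators, pick_next):
        self.ops = operators            # operator objects exposing apply(tuple) -> bool
        self.pick_next = pick_next      # routing policy: (tuple, stats) -> operator index
        self.stats = {}                 # statistics maintained by the eddy (costs, selectivities)

    def process(self, t):
        while not all(t.done):
            i = self.pick_next(t, self.stats)   # choose an operator with ready[i] and not done[i]
            if not self.ops[i].apply(t):
                return None                     # predicate failed: the tuple is dropped
            t.done[i] = True                    # the operator updates the bitmaps ...
            t.ready[i] = False                  # ... before handing the tuple back
        return t                                # tuple has visited all operators
```

A concrete routing policy, such as the lottery-based scheme of [2], would be supplied as pick_next.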
However, a centralized eddy cannot be directly employed in a distributed data stream management system without incurring unnecessary network traffic and delays, and would almost certainly end up being a bottleneck.

In this paper we study the design, implementation, and performance of the following distributed eddy algorithm. After an operator processes a tuple, instead of returning the tuple to a centralized eddy, the operator makes a routing decision based on the execution history of the tuple and statistics maintained at the operator. Figure 1.2 shows a distributed plan with three operators. The dashed arrows indicate possible routes between operators; the four solid arrows indicate one possible execution order that a tuple might actually take. The routing policy at each operator decides the execution order of the operators for each tuple, thereby dynamically optimizing the distributed stream query plan. The purpose of this paper is to study the effectiveness of different routing policies.

Figure 1.1: A centralized eddy with three operators. Figure 1.2: A distributed query plan.
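As an illustration of the per-operator routing step just described, here is a hedged sketch: after evaluating its predicate, an operator updates the tuple's execution history, consults its locally maintained statistics, and forwards the tuple directly to the next operator. The names (DistributedOperator, handle, send_to, emit, policy) are assumptions for the example, not the authors' implementation; the actual routing policies are evaluated in Sections 4 and 5.

```python
# Illustrative sketch of an operator's local routing step in a distributed eddy.

def send_to(address, t):
    """Placeholder for the system's network send to another operator."""

def emit(t):
    """Placeholder for delivering a fully processed result tuple."""

class DistributedOperator:
    def __init__(self, op_id, predicate, peers, policy):
        self.op_id = op_id
        self.predicate = predicate   # predicate(tuple) -> bool
        self.peers = peers           # op_id -> network address of the other operators
        self.policy = policy         # routing policy: (tuple, stats, candidates) -> next op_id
        self.stats = {}              # statistics maintained locally at this operator

    def handle(self, t):
        if not self.predicate(t):
            return                                   # tuple dropped; nothing to forward
        t.done[self.op_id] = True                    # update the tuple's execution history
        remaining = [i for i, d in enumerate(t.done) if not d]
        if not remaining:
            emit(t)                                  # tuple has visited all operators
        else:
            nxt = self.policy(t, self.stats, remaining)
            send_to(self.peers[nxt], t)              # forward directly, with no central eddy
```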
As discussed earlier, query response time is not an appropriate metric for evaluating a data stream management system. Instead we propose the following two metrics:

ART - the average response time, measured as the average time between when a tuple enters and leaves the operators that form the distributed eddy.

MDR - the maximum data rate the system can handle before an operator becomes a bottleneck.

The formal description of the system and rigorous definitions of these metrics will be given in Section 3. Section 4 examines the impact of the routing policy on system performance: the distributed query plan is modelled using a queuing network and a solution technique is described. We also study several practical routing policies that have straightforward implementations and compare their performance.

A distributed stream processing system must be able to dynamically adapt to configuration changes such as adding or removing computing resources. Changes in input data rates may also require the system to re-allocate resources via load sharing techniques. Aurora* [6] implements box sliding and box splitting to enable load sharing across nodes. The natural way of applying these load sharing techniques is to split the workload of an overloaded node and to merge the workloads of lightly loaded nodes. The routing policy is an important factor in determining which node is likely to become overloaded. Section 5 examines the effect of the routing policy on system throughput and resource allocation when computing resources can be added to or removed from a node. Conclusions and future research directions are contained in Section 6.

2. Related Work

There are a number of research projects currently studying issues related to streaming data [1][2][3][4][5][6][7][8][12][16][18][22][26]. Those that are most closely related to our work are the Aurora* [6][8], STREAM [3][4][22], Telegraph [2][9][18] and Cougar [7][12] projects.

The original eddy paper [2] introduced the concept of routing tuples between operators as a form of query optimization. This paper extends the idea of an eddy to a distributed environment. The routing policies described in [2] and [18] are compared against several other routing policies in Sections 4 and 5.

Aurora [8] describes the architecture of a data stream management system. Aurora* [6] extends Aurora to a distributed environment and discusses load sharing techniques. Aurora also uses routing as a mechanism to reorder operators; the routing mechanism is similar to that of an eddy, and our results can be adapted to Aurora*.

STREAM [3] describes a query language and precise semantics for stream queries. [5][22] describe both operator scheduling and resource management in a centralized data stream management system, focusing on minimizing inter-operator queue length or memory consumption. In [22] a near-optimal scheduling algorithm for reducing inter-operator queue size is presented. In addition, [22] explores using constraints to optimize stream query plans.

Cougar [7][12] is a distributed sensor database system. Cougar focuses on forming clusters out of sensors to allow intelligent in-network aggregation that conserves energy by reducing the amount of communication between sensor nodes.

[27] asserts that execution time is not an appropriate goal for optimizing stream queries and proposes the use of output rates as more appropriate. The output rate metric proposed in [27] is essentially equivalent to our MDR.

Several approaches have been proposed for gathering statistics over a stream [4][11][13][16][19][20][21], with the primary focus being how to obtain good estimates over streaming data with limited amounts of memory and minimal CPU usage. These results will be critical to the design of accurate routing policies for any distributed eddy implementation.

There are many papers that describe the use of queuing networks to analyze computer systems; [14][15] are the standard texts on this subject.

3. Overview of the System Model and Performance Metrics

We model a distributed stream query plan as a set of operators Opi, i = 1, ..., n, connected by a network. Input tuples to an operator are added to a first-come, first-served (FCFS) queue, as shown in Figure 3.1. Opi.R resources (i.e., CPU, memory and network bandwidth) are assumed to be available to each operator Opi. We further assume that each input tuple to Opi consumes, on average, Opi.r resources. Thus, Opi can process at most Opi.R/Opi.r input tuples per time unit, and the average service time Ts for each individual tuple is Opi.r/Opi.R.

Operators in this model have only one input queue. We briefly explain how to implement the join operator, which logically has two input streams. Our treatment of join is very much like a distributed version of SteMs
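To make the per-operator capacity model above concrete, the sketch below estimates the maximum input data rate the plan can sustain before some operator saturates, i.e., the MDR. It is illustrative only: the average number of visits each input tuple makes to an operator is an assumed parameter here, whereas the paper derives the corresponding quantities from its queuing-network model.

```python
# Illustrative sketch: per-operator capacity is R/r tuples per time unit;
# an operator saturates when its arrival rate exceeds that capacity.

def max_data_rate(ops):
    """ops: list of (R, r, visits_per_input_tuple) for each operator."""
    mdr = float("inf")
    for R, r, visits in ops:
        service_rate = R / r                        # tuples the operator can process per time unit
        if visits > 0:
            mdr = min(mdr, service_rate / visits)   # input rate at which this operator saturates
    return mdr

# Example: three operators, each visited once per input tuple.
print(max_data_rate([(100.0, 2.0, 1.0), (80.0, 1.0, 1.0), (120.0, 4.0, 1.0)]))
# -> 30.0 tuples per time unit, limited by the third operator
```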