
arXiv:2104.13124v1 [cs.LO] 27 Apr 2021

[Long version of the LICS 2021 paper, with full proofs in the appendix.]

Combinatorial Proofs and Decomposition Theorems for First-order Logic

Dominic J. D. Hughes (Logic Group, U.C. Berkeley, USA), Lutz Straßburger (Inria, Equipe Partout, LIX, Ecole Polytechnique, France), Jui-Hsuan Wu (Ecole Normale Supérieure, France)

Abstract—We uncover a close relationship between combinatorial and syntactic proofs for first-order logic (without equality). Whereas syntactic proofs are formalized in a deductive proof system based on inference rules, a combinatorial proof is a syntax-free presentation of a proof that is independent from any set of inference rules. We show that the two proof representations are related via a deep inference decomposition theorem that establishes a new kind of normal form for syntactic proofs. This yields (a) a simple proof of soundness and completeness for first-order combinatorial proofs, and (b) a full completeness theorem: every combinatorial proof is the image of a syntactic proof.

I. INTRODUCTION

First-order predicate logic is a cornerstone of modern logic. Since its formalisation by Frege [1] it has seen growing usage in many fields of computer science. Upon the development of proof theory by Hilbert [2], proofs became first-class citizens as mathematical objects that could be studied on their own. Since Gentzen's sequent calculus [3], [4], many other proof systems have been developed that allow the implementation of efficient proof search, for example analytic tableaux [5] or resolution [6]. Despite the immense progress made in automated and interactive theorem provers, we still have no satisfactory notion of proof identity for first-order logic. In this respect, proof theory is quite different from any other mathematical field. For example, in group theory, two groups are the same iff they are isomorphic; in topology, two spaces are the same iff they are homeomorphic; etc. In proof theory, we have no such notion telling us when two proofs are the same, even though Hilbert was considering this as a possible 24th problem [7] for his famous 1900 lecture [8], before proof theory existed as a mathematical field.

The main reason for this problem is that formal proofs, as they are usually studied in logic, are inextricably tied to the syntactic (inference rule based) proof system in which they are carried out. And it is difficult to compare two proofs that are produced within two different proof systems, based on two different sets of syntactic inference rules. Consider the derivations in Figure 1, showing two proofs of the formula ((p̄ ∨ q) ∧ p̄) ∨ p and two proofs of the formula ∃x.(p̄x ∨ (∀y.py)), in sequent calculus (top) and in a deep inference system (bottom). It is a priori not clear how to compare them.
[Fig. 1 (derivations not reproduced). Left: syntactic proofs in sequent calculus (above) and in the calculus of structures (below), which translate to the same propositional combinatorial proof (centre). Right: syntactic proofs in sequent calculus (above) and in the new calculus KS1 introduced in this paper (below), which translate to the same first-order combinatorial proof (centre).]

This is where combinatorial proofs come in. They were introduced by Hughes [9] for classical propositional logic as a syntax-free notion of proof, and as a potential solution to Hilbert's 24th problem [10] (see also [11]). The basic idea is to abstract away from the syntax of the inference rules used in inductively-generated proofs and consider the proof as a combinatorial object, more precisely as a special kind of graph homomorphism. For example, a propositional combinatorial proof of Peirce's law ((p⇒q)⇒p)⇒p = ((p̄ ∨ q) ∧ p̄) ∨ p is shown mid-left in Fig. 1, a homomorphism from a 4-vertex graph with two colours (above) to a 4-vertex graph labelled with propositional variables (below); dotted vertical lines define the homomorphism, from above to below.

Several authors have illustrated how syntactic proofs in various proof systems can be translated to propositional combinatorial proofs: for sequent proofs in [10], for deep inference proofs in [12], for Frege systems in [13], and for tableaux systems and resolution in [14]. This enables a natural definition of proof identity for propositional logic: two proofs are the same if they are mapped to the same combinatorial proof. For example, the left side of Fig. 1 translates syntactic proofs from sequent calculus and the calculus of structures into the same combinatorial proofs, witnessing that the two syntactic proofs, from different systems, are the same. Recently, Acclavio and Straßburger extended this notion to relevant [15] and to modal logics [16], and Heijltjes, Hughes and Straßburger have provided combinatorial proofs for intuitionistic propositional logic [17].

In this paper we advance the idea that combinatorial proofs can provide a notion of proof identity for first-order logic. First-order combinatorial proofs were introduced by Hughes in [18]. For example, a first-order combinatorial proof of Smullyan's drinker paradox ∃x(px ⇒ ∀ypy) = ∃x.(p̄x ∨ (∀y.py)) is shown on the right of Fig. 1, a homomorphism from a 5-vertex partially coloured graph (with one colour) to a 4-vertex labelled graph. However, even though Hughes proves soundness and completeness, the proof is unsatisfactory: (1) the soundness proof is long, intricate and cumbersome, and (2) the completeness proof does not allow a syntactic proof to be read back from a combinatorial proof, i.e., completeness is not sequentializable [19] nor full [20]. A fundamental problem is that not all combinatorial proofs can be obtained as translations of sequent calculus proofs.

We solve these issues by moving to a deep inference system. More precisely, we introduce a new proof system, KS1, for first-order logic, that (a) reflects every combinatorial proof, i.e., there is a surjection from KS1 proofs to combinatorial proofs, (b) yields simpler proofs of soundness and completeness for combinatorial proofs, and (c) admits new decomposition theorems establishing a precise correspondence between certain syntactic inference rules and certain combinatorial notions. The right of Fig. 1 illustrates the surjection in (a), and since the syntactic proofs in the two systems translate to the same combinatorial proof, they can be considered the same.

In general, a decomposition theorem provides normal forms of proofs, separating different kinds of inference rules of a proof system. A prominent example of a decomposition theorem is Herbrand's theorem [21], which allows a separation between the propositional part and the quantifier part in a first-order proof [4], [22]. Through the advent of deep inference, new kinds of proof decompositions became possible, most notably the separation between the linear part of a proof and the resource management of a proof. It has been shown by Straßburger [23] that a proof in classical propositional logic can be decomposed into a proof of multiplicative linear logic, followed by a proof consisting only of contractions and weakenings (see also [10, §4]). In this paper we show that the same is possible for first-order logic.

Combinatorial proofs and deep inference can be seen as opposite ends of a spectrum: whereas deep inference allows for a very fine granularity of inference rules—one inference rule in a standard formalism, like sequent calculus or semantic tableaux, is usually simulated by a sequence of different deep inference rules—combinatorial proofs have completely abolished the concept of inference rule. And yet, there is a close relationship between the two, realized through a decomposition theorem, as we establish in this paper.

Outline: This paper has three parts. First, in Sections II–V we present the preliminaries on first-order logic, first-order graphs, first-order combinatorial proofs, and the first-order proof system KS1. Second, in Section VI we state the main results. And third, in Sections VII–X we give their proofs.

II. PRELIMINARIES: FIRST-ORDER LOGIC

A. Terms and Formulas

Fix pairwise disjoint countably infinite sets VAR = {x, y, z, ...} of variables, FUN = {f, g, ...} of function symbols, and PRED = {p, q, ...} of predicate symbols. Each function and each predicate symbol has a finite arity. Each predicate symbol p has a dual p̄ with p ≠ p̄. The grammars below generate the set TERM of terms, denoted by s, t, u, ..., the set ATOM of atoms, denoted by a, b, c, ..., and the set FORM of formulas, denoted by A, B, C, ...:

t ::= x | f(t1, ..., tn)
a ::= t | f | p(t1, ..., tn) | p̄(t1, ..., tn)
A ::= a | A ∧ A | A ∨ A | ∃x.A | ∀x.A

where the arity of f and p is n. For better readability we often omit parentheses and write ft1...tn or pt1...tn. We consider the constants t (true) and f (false) as additional atoms, and consider all formulas in negation normal form, where negation (·̄) is defined on atoms and formulas via De Morgan's laws: t̄ = f and f̄ = t; the negation of p(t1, ..., tn) is p̄(t1, ..., tn), and vice versa; the negation of A ∧ B is Ā ∨ B̄ and the negation of A ∨ B is Ā ∧ B̄; the negation of ∃x.A is ∀x.Ā and the negation of ∀x.A is ∃x.Ā.

Note that negation is an involution on atoms: the negation of ā is a. We write A ⇒ B as an abbreviation for Ā ∨ B.
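As a small worked illustration of the ⇒ abbreviation and the De Morgan laws just given (our example, not one appearing in the paper), the drinker formula from the introduction unfolds and negates as follows:

\[
  px \Rightarrow \forall y.py \;=\; \overline{p}x \vee \forall y.py ,
  \qquad
  \overline{\exists x.(\overline{p}x \vee \forall y.py)}
  \;=\; \forall x.(px \wedge \exists y.\overline{p}y) .
\]

Both sides of the second equation are already in negation normal form, so no explicit negation symbol on compound formulas is ever needed.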

A formula is rectified if all bound variables are distinct from one another and from all free variables. Every formula can be transformed into a logically equivalent rectified form by bound variable renaming, e.g. (px ∨ ∃xqx) ∧ ∃xrx ↦ (px ∨ ∃yqy) ∧ ∃zrz. If we consider formulas equivalent modulo bound variable renaming (α-conversion), the rectified form of a formula A is unique, and we denote it by Â.

A substitution is a function σ : VAR → TERM that is the identity almost everywhere. We denote substitutions as σ = [x1/t1, ..., xn/tn], where σ(xi) = ti for i = 1..n and σ(x) = x for all x ∉ {x1, ..., xn}. Write Aσ for the formula obtained from A by applying σ, i.e., by simultaneously replacing all occurrences of xi by ti. A variable renaming is a substitution ρ with ρ(x) ∈ VAR for all variables x.

[Fig. 2. Sequent calculi LK1 (all rules) and MLL1X (rules in the dashed box, i.e., all rules except ctr and wk):
  ax: ⊢ a, ā
  ∨: from ⊢ Γ, A, B derive ⊢ Γ, A ∨ B
  ∧: from ⊢ Γ, A and ⊢ B, ∆ derive ⊢ Γ, A ∧ B, ∆
  t: ⊢ t
  f: from ⊢ Γ derive ⊢ Γ, f
  mix: from ⊢ Γ and ⊢ ∆ derive ⊢ Γ, ∆
  ∃: from ⊢ Γ, A[x/t] derive ⊢ Γ, ∃x.A
  ∀: from ⊢ Γ, A derive ⊢ Γ, ∀x.A (x not free in Γ)
  ctr: from ⊢ Γ, A, A derive ⊢ Γ, A
  wk: from ⊢ Γ derive ⊢ Γ, A]

B. Sequent Calculus LK1

Sequents, denoted by Γ, ∆,..., are finite multisets of for- coloured if it carries a partial equivalence relation ∼G on 1 mulas, written as lists, separated by comma. The correspond- VG ; each equivalence class is a colour. A vertex renaming ing formula of a (non-empty) sequent Γ= A1, A2,...,An is of G = hVG , EG i along a bijection (ˆ·): VG → VˆG is the ˆ ˆ the disjunction of its formulas: W(Γ) = A1 ∨ A2 ∨···∨ An. graph G = hVG , {vˆwˆ | vw ∈ EG }i, with colouring and/or A sequent is rectified iff its corresponding formula is. labelling inherited (i.e., vˆ ∼ wˆ if v ∼ w, and ℓ(ˆv) = ℓ(v)). In this paper we use the sequent calculus LK1, shown in Following standard graph theory, we identify graphs modulo Figure 2, which is a one-sided variant of Gentzen’s original vertex renaming. calculus [3] for first-order logic. To simplify some technicali- A directed graph G = hVG , EGi is a set VG of vertices and a ties later in this paper, we include the mix rule. set EG ⊆ VG ×VG of direct edges.A directed graph homomor- phism ϕ: hV , E i → hV , E i is a function ϕ: V → V Theorem 1. LK1 is sound and complete for first-order logic. G G H H G H such that if (v, w) ∈ EG then (ϕ(v), ϕ(w)) ∈ EH. For a proof, see any standard textbook, e.g. [24]. The linear fragment of LK1, i.e., the fragment without B. Cographs ctr wk the rules (contraction) and (weakening) defines first- A graph H = hVH, EHi is a subgraph of a graph G = order multiplicative [19], [25] with mix [26], hVG , EG i if VH ⊆ VG and EH ⊆ EG. It is induced if MLL1X [27] (MLL1+mix). We denote that system here with v, w ∈ VH and vw ∈ EG implies vw ∈ EH. An induced (shown in Figure 2 in the dashed box). subgraph of G = hVG , EG i is uniquely determined by its set We will use the cut elimination theorem. The cut rule is of vertices V and we denote it by G[V ]. A graph is H-free if it ⊢ Γ, A ⊢ A, ∆ does not contain H as an induced subgraph. The graph P4 is cut (1) the (undirected) graph h{v1, v2, v3, v4}, {v1v2, v2v3, v3v4}i.A ⊢ Γ, ∆ cograph is a P4-free undirected graph. The interest in cographs Theorem 2. If a sequent ⊢ Γ is provable in LK1+cut then it for our paper comes from the following well-known fact. is also provable in LK1. Furthermore, if ⊢ Γ is provable in MLL1X+cut then it is also provable in MLL1X. Theorem 3 ([28], [29]). A graph is a cograph iff it can be constructed from the singletons via the operations + and ×. As before, this is standard, see e.g. [24] for a proof. In a graph G, the neighbourhood N(v) of a vertex v ∈ VG III. PRELIMINARIES: FIRST-ORDER GRAPHS is {w | vw ∈ EG }.A module is a set M ⊆ VG with N(v)\M = A. Graphs N(w) \ M for all v, w ∈ M. A module M is strong if for every module M ′ we have M ′ ⊆ M, M ⊆ M ′ or M ∩M ′ = ∅. A graph G = hVG , EG i is a pair where VG is a finite set of A module is proper if it has two or more vertices. vertices and EG is a finite set of edges, which are two- subsets of VG . We write vw for an edge {v, w}. C. Fographs Let G = hVG , EG i and H = hVH, EHi be graphs such that A cograph is logical if every vertex is labelled by either an VG ∩ VH = ∅. A homomorphism ϕ: G → H is a function atom or variable, and it has at least one atom-labelled vertex. ϕ: VG → VH such that if vw ∈ EG then ϕ(v)ϕ(w) ∈ EH. An atom-labelled vertex is a literal and a variable-labelled The union G + H is the graph hVG ∪ VH, EG ∪ EHi and the vertex is a binder. A binder labelled with x is an x-binder. 
join G × H is the graph hVG ∪ VH, EG ∪ EH ∪{vw | v ∈ The scope of a binder b is the smallest proper strong module VG , w ∈ VH}i. A graph G is disconnected if G = G1 + G2 for two non-empty graphs G1, G2, otherwise it is connected. 1In [9] and [18] adjacent vertices must have distinct colours, following A graph G is labelled in a set L if each vertex v ∈ VG the standard definition of colouring in graph theory. We choose to omit this has an associated label ℓ(v) ∈ L. A graph G is (partially) condition here, as it is implied by the preclusion of bimatchings in Def. 10.
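As a small illustration of Theorem 3 and of the module terminology (our example, not from the paper), consider the three-vertex graph built from singletons by one union and one join:

\[
  G \;=\; \big(\langle\{a\},\emptyset\rangle + \langle\{b\},\emptyset\rangle\big) \times \langle\{c\},\emptyset\rangle
  \;=\; \langle \{a,b,c\},\ \{ac,\ bc\} \rangle .
\]

By Theorem 3, G is a cograph. The set {a, b} is a proper strong module: N(a) \ {a, b} = N(b) \ {a, b} = {c}. By contrast, {a, c} is not a module, since N(a) \ {a, c} = ∅ while N(c) \ {a, c} = {b}.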

3 containing b. An x-literal is a literal whose atom contains the variable x. An x-binder binds every x-literal in its scope. In px pz px pz x x a logical cograph G, a binder b is existential (resp. universal) z z if, for every other vertex v in its scope, we have bv ∈ EG y y (resp. bv∈ / EG ). An x-binder is legal if its scope contains no qy qfz qy qfz other x-binder and at least one literal. Definition 4 ([18, §3]). A first-order graph or fograph G is a Fig. 3. A fonet (left) with dualizer [x/z,y/fz] and its leap graph (right). logical cograph whose binders are all legal. The binding graph of G is the directed graph G~ = hVG , {(b,l) | b binds l}i. We define a mapping J·K from formulas to (labelled) graphs, IV. FIRST-ORDER COMBINATORIAL PROOFS inductively as follows: A. Fonets JaK = •a (for any atom a) Two atoms are pre-dual if they are not t or f, and their JA ∨ BK = JAK + JBK J∃x.AK = •x × JAK predicate symbols are dual (e.g. p(x, y) and p(y,z)) and two JA ∧ BK = JAK × JBK J∀x.AK = •x + JAK literals are pre-dual if their labels (atoms) are pre-dual. A where we write •α for a single-vertex labelled by α. linked fograph hC, ∼Ci is a coloured fograph C such that every colour (i.e., equivalence class of ∼C), called a link, consists Example 5. Here is the fograph of the drinker formula of two pre-dual literals, and every literal is either t-labelled or ∃x(px ⇒ ∀ypy)= ∃x.(px ∨ (∀y.py)): x py in a link. Hence, in a linked fograph no vertex is labelled f. y px Let C be a linked fograph. The set of links can be seen as Lemma 6. If A is a rectified formula then JAK is a fograph. a unification problem by identifying dual predicate symbols. A dualizer of C is a substitution δ unifying all the links of C. Proof. That JAK is a logical cograph follows immediately from Since a first-order unification problem is either unsolvable the definition and Theorem 3. The fact that every binder of or has a most general unifier, we can define the notion of JAK is legal can be proved by structural induction on A. most general dualizer.A dependency is a pair {•x, •y} of an Remark 7. Note that JAK need not be a fograph if A is not existential binder •x and a universal binder •y such that the most general dualizer assigns to x a term containing y.A leap rectified. If A = (∀x.px) ∨ (∀x.qx), then JAK = •x • px • L x •qx, the scope of each x-binder contains all the vertices, in is either a link or a dependency. The leap graph C of C is particular, the other x-binder. On the other hand, there are non- the undirected graph hVC ,LCi where LC is the set of leaps of rectified formulas which are translated to fographs by J·K. For C. A vertex set W ⊆ VC induces a matching in C if W 6= ∅ example, in the graph of (∃x.px)∨(∃x.qx), both x-binders are and for all w ∈ W , N(w) ∩ W is a singleton. We say that W legal, as they are not in each other’s scope: x px x qx. induces a bimatching in C if it induces a matching in C and a matching in CL. We define a congruence relation ≡ on formulas, called equiv- alence, by the following equations: Definition 10 ([18, §5]). A first-order net or fonet is a linked fograph which has a dualizer but no induced bimatching. A ∧ B ≡ B ∧ A (A ∧ B) ∧ C ≡ A ∧ (B ∧ C) A ∨ B ≡ B ∨ A (A ∨ B) ∨ C ≡ A ∨ (B ∨ C) (2) Figure 3 shows a fonet with its dualizer and leap graph. ∀x.∀y.A ≡ ∀y.∀x.A ∀x.(A ∨ B) ≡ (∀x.A) ∨ B ∃x.∃y.A ≡ ∃y.∃x.A ∃x.(A ∧ B) ≡ (∃x.A) ∧ B B. Skew Bifibrations where x must not be free in B in the last two equations. A graph homomorphism ϕ: hVG , EGi → hVH, EHi is a Theorem 8 ([18, §10]). 
Let A, B be rectified formulas. Then fibration [30], [31] if for all v ∈ VG and wϕ(v) ∈ EH, there exists a unique w˜ ∈ V such that wv˜ ∈ E and ϕ(˜w) = w A ≡ B ⇐⇒ JAK = JBK G G (indicated below-left), and is a skew fibration [9, §3] if for Proof. A straightforward structural induction on formulas. all v ∈ VG and wϕ(v) ∈ EH there exists w˜ ∈ VG such Example 9. ∃x.(px ∨ (∀y.py)) ≡ ∃x∀y (py ∨ px), and both that wv˜ ∈ EG and ϕ(˜w)w∈ / EH (indicated below-centre). A formulas have the same (rectified) fograph D, below-left. directed graph homomorphism is a fibration if for all v ∈ VG and (w, ϕ(v)) ∈ E , there exists a unique w˜ ∈ V such that × H G x px x px (˜w, v) ∈ EG and ϕ(˜w)= w (indicated below-right). + x ∃w ∃!w v e v ∃!w v y py y py e e y px py Above-center we show the cotree of the underlying cograph ϕ(w) (illustrating the idea behind Theorem 3) and above-right is its w ϕ(v) e ϕ(v) w ϕ(v) binding graph D~ . w
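To make the dualizer of Fig. 3 concrete (this is our reading of the figure, whose drawing is not reproduced above, assuming its two links pair px with p̄z and qy with q̄fz, as the displayed substitution suggests), the induced unification problem and its most general solution are:

\[
  \{px,\ \overline{p}z\} \;\leadsto\; x \doteq z ,
  \qquad
  \{qy,\ \overline{q}fz\} \;\leadsto\; y \doteq fz ,
  \qquad
  \delta = [x/z,\ y/fz] .
\]

If, say, x and y are existential binders and z is a universal one, then δ assigns to each of x and y a term containing z, so {•x, •z} and {•y, •z} are dependencies and hence leaps in the leap graph on the right of the figure.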

[Fig. 4: the four combinatorial proof diagrams are not reproduced here; see the caption below. The formulas proved are (∀xpx) ⇒ ∀y(py ∧ pfy), ∃x(pa ∨ pb ⇒ px), qab ∨ qba ⇒ ∃x∃y qxy, and ∀x(pfx ⇒ px) ⇒ ∀y(pffy ⇒ py).]

Fig. 4. Four combinatorial proofs, each shown above the formula proved. Here x and y are variables, f is a unary function symbol, a and b are constants (nullary function symbols), p is a unary predicate symbol, and q is a binary predicate symbol. For each skew bifibration ϕ, the variable substitution ρϕ is an identity, thus we can omit labels from each (coloured) source fograph (since the label of v in the source is that of ϕ(v) in the target).
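As a worked instance of the translation J·K (our computation, not part of the figure), the lower graph of the left-most proof in Fig. 4 can be obtained as follows: the formula (∀xpx) ⇒ ∀y(py ∧ pfy) abbreviates (∃x.p̄x) ∨ (∀y.(py ∧ pfy)), and

\[
  [[\,(\exists x.\overline{p}x) \vee (\forall y.(py \wedge pfy))\,]]
  \;=\;
  (\bullet_x \times \bullet_{\overline{p}x}) \;+\; \big(\bullet_y + (\bullet_{py} \times \bullet_{pfy})\big) ,
\]

a five-vertex fograph whose only edges join the x-binder with p̄x, and the literal py with pfy.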

[Fig. 5: the condensed combinatorial proof diagrams are not reproduced here; they prove the same four formulas as in Fig. 4.]

Fig. 5. Condensed forms of the four combinatorial proofs in Figure 4. We do not show the lower graph, and indicate the mapping by the position of the vertices of the upper graph.

A fograph homomorphism ϕ = hϕ, ρ i is a pair where ϕ S{t} S{t} S{A} ϕ: G → H is a graph homomorphism between the underlying ∀ ai t t t graphs, and ρϕ, also called the substitution induced by ϕ, S{∀x. } S{a ∨ a} S{A ∧ } is a variable renaming such that for all v ∈ V we have G S{A ∧ (B ∨ C)} S{A ∧ B} ℓ(ϕ(v)) = ρϕ(ℓ(v)), and ρϕ is the identity on variables not s mix in G. Note that ϕ necessarily maps binders to binders and S{(A ∧ B) ∨ C} S{A ∨ B} literals to literals. Since ρϕ is fully determined by ϕ alone, we S{A[x/t]} S{B} often leave ρϕ implicit. A fograph homomorphism ϕ: G → H ∃ ≡ (where A ≡ B) preserves existentials if for all existential binders b in G, the S{∃x.A} S{A} binder ϕ(b) is existential in H. S{A} S{(A ∧ C) ∨ (B ∧ D)} S{a ∨ a} w m ac Definition 11 ([18, §4]). Let G and H be fographs. A skew S{A ∨ B} S{(A ∨ B) ∧ (C ∨ D)} S{a} bifibration ϕ: G → H is an existential-preserving fograph homomorphism that is a skew fibration on hV , E i → S{(∀x.A) ∨ (∀x.B)} S{(∃x.A) ∨ (∃x.B)} G G m m ~ ~ ∀ ∃ hVH, EHi and a fibration on the binding graphs G → H. S{∀x.(A ∨ B)} S{∃x.(A ∨ B)} Example 12. Below-left is a skew bifibration, whose binding S{A} S{∀x.∀x.A} fibration is below-centre. When the labels on the source w∀ (x not free in A) c∀ fograph can be inferred (modulo renaming), we often omit S{∀x.A} S{∀x.A} the labelling in the upper graph, as below-right. X x py x py Fig. 6. Deep inference systems KS1 (all rules) and MLS1 (rules in the dashed box) x y x y px px than the original definition of [18], as we allow for a variable x py x py x py renaming ρ which was restricted to be the identity in [18]. y y y ϕ px px px V. FIRST-ORDER DEEP INFERENCE SYSTEM KS1 Definition 13 . first-order combinatorial proof ([18, §6]) A In contrast to standard proof formalisms, like sequent calculi (FOCP) of a fograph G is a skew bifibration ϕ: C → G where or tableaux, where inference rules decompose the principal first-order combinatorial proof C is a fonet. A of a formula formula along its root connective, deep inference rules apply A is a combinatorial proof of its graph JAK. like rewriting rules inside any (positive) formula or sequent Figure 4 shows examples of FOCPs (taken from [18]), each context, which is denoted by S{·}, and which is a formula above the formula it proves. The same FOCPs are in Figure 5 (resp. sequent) with exactly one occurrence of the hole {·} in in condensed form, with the formula graph left implicit. the position of an atom. Then S{A} is the result of replacing the hole {·} in S{·} with A. Theorem 14 ([18, §6]). FOCPs are sound and complete for Figure 6 shows the inference rules for the deep inference first-order logic. system KS1 introduced in this paper. It is a variation of the Remark 15. Our definition of FOCP is slightly more lax systems presented by Br¨unnler [32] and Ralph [33] in their

5 PhD-theses. The main differences are (i) the explicit presence t ∀ of the mix-rule, (ii) a different choice of how the formula ∀y.t t equivalence ≡ is defined, (iii) an explicit rule for the equiv- ∀y.(t ∧ t) w c ai alence, and (iv) new inference rules ∀ and ∀. The reason ∀y.((py ∨ py) ∧ t) behind these design choices is to obtain the correspondence ai ∀y.((py ∨ py) ∧ (pfy ∨ pfy)) with combinatorial proofs and the full completeness result. ≡ ∀y.(py ∨ (py ∧ (pfy ∨ pfy))) We consider here only the cut-free fragment, as cut- s ∀y.(py ∨ ((py ∧ pfy) ∨ pfy)) elimination for deep inference systems has already been dis- ≡ cussed elsewhere (e.g. [22], [34]).2 As with the sequent system ∀y.((py ∨ pfy) ∨ (py ∧ pfy)) X ∃ LK1, we also need for KS1 the linear fragment, MLS1 , and ∀y.((py ∨ (∃x.px)) ∨ (py ∧ pfy)) ∃ that is shown in Figure 6 in the dashed box. ∀y.(((∃x.px) ∨ (∃x.px)) ∨ (py ∧ pfy)) B ≡ ((∃x.px) ∨ (∃x.px)) ∨ (∀y.(py ∧ pfy) We write S Φ to denote a derivation Φ from B to A using m∃ (∃x.(px ∨ px) ∨ (∀y.(py ∧ pfy)) A ac the rules from system S. A formula A is provable in a system (∃x.px) ∨ (∀y.(py ∧ pfy)) S if there is a derivation in S from t to A. Fig. 7. Example derivation in decomposed form of Theorem 17 We will for some results also employ the general (non- atomic) version of the contraction rule: Theorem 17 is also the reason why we have the rules w∀ and S{A ∨ A} c KS1 c (3) ∀ in system , as these rules are derivable with the other S{A} rules. However, they are needed to obtain this decomposition. Figure 7 shows an example of a decomposed derivation in VI. MAIN RESULTS KS1 of the formula (∃x.px) ∨ (∀y.(py ∧ pfy)). We state the main results of this paper here, and prove them A weaker version of Theorem 17 will also be useful: in later sections. The first is routine and expected, but must t be proved nonetheless: Theorem 18. For every derivation KS1 Φ there is a formula A′ Theorem 16. KS1 is sound and complete for first-order logic. A with no occurrence of f and a derivation Our second result is more surprising, as it is a very strong t decomposition result for first-order logic. MLS1X t A′ Theorem 17. For every derivation KS1 Φ there are f-free {w,c,≡} A formulas A1,...,A5 and a derivation A ′ t Here A corresponds to A3 of Theorem 17. We now establish the connection between derivations in {∀,ai,t} KS1 and combinatorial proofs. A5 {s,mix,≡} Theorem 19. Let ϕ: C → A be a combinatorial proof and A4 let A be a formula with A = JAK. Then there is a derivation {∃} t A3 X MLS1 Φ1 {m,m∀,m∃,≡} ′ A2 A (4) w w ac c m m m Φ {ac,c∀} { , ∀, , ∀, , ∀, ∃,≡} 2 A1 A {w,w∀,≡} ′ for some A ≡ Cρϕ where C is a formula with JCK = C A and ρϕ is the variable renaming substitution induced by ϕ. This theorem is stronger than the existing decompositions for Conversely, whenever we have a derivation as in (4) above, first-order logic, which either separate only atomic contraction such that f does not occur in A′, then there is a combinatorial and atomic weakening [32] or only contraction [33] or only proof ϕ: C → JAK such that C = JA′K. c the quantifiers in form of a Herbrand theorem [35], [33]. Furthermore, in the proof of Theorem 19, we will see that

(i) the links in the fonet C correspond precisely to the pairs of atoms that meet in the instances of the ai-rule in the derivation Φ1, and (ii) the "flow-graph" of Φ2 that traces the quantifier-

2In the deep inference literature, the cut-free fragment is also called the down-fragment. But as we do not discuss the up-fragment here, we omit the down-arrows ↓ in the rule names.
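As a concrete instance of how the rules of Figure 6 rewrite inside a context (a made-up example, not one from the paper), here is the switch rule s applied with context S{·} = ∀y.({·} ∨ qy), A = py, B = p̄y and C = t, premise above and conclusion below as in Figure 6:

\[
  \mathsf{s}\;\;
  \frac{\forall y.\big((py \wedge (\overline{p}y \vee \mathrm{t})) \vee qy\big)}
       {\forall y.\big(((py \wedge \overline{p}y) \vee \mathrm{t}) \vee qy\big)}
\]

The surrounding ∀y and the disjunct qy are left untouched; this deep applicability is exactly what distinguishes KS1 from a sequent calculus, where a rule may only act on the root connective of a formula in the sequent.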

6 S{A} and atom-occurrences in the derivation corresponds exactly rule instance r . By induction hypothesis we have a to the vertex-mapping induced by ϕ. To give an example, S{B} consider the derivation in Figure 7 which corresponds to the MLL1X proof Π of ⊢ S{A}. By Lemmas 21 and 22, we have left-most combinatorial proof in Figures 4 and 5. a MLL1X proof of ⊢ S{A},S{B}. We can compose them via Thus, combinatorial proofs are closely related to derivations ⊢ S{A} ⊢ S{A},S{B} of the form (4), and since by Theorem 17 every derivation can cut be transformed into that form, we can say that combinatorial ⊢ S{B} proofs provide a canonical proof representation for first-order and then apply Theorem 2. logic, similarly to what proof nets are for linear logic [36]. Finally, Theorems 16, 17 and 19 imply Theorem 14, which B. Contraction and Weakening means that we have here an alternative proof of the soundness The first observation here is that Lemmas 20–23 from above and completeness for first-order combinatorial proofs which also hold for LK1 and KS1. We therefore immediately have: is simpler than the one given in [18], and improves with completeness being full (a surjection from syntactic KS1 Theorem 24. For every sequent Γ, we have that ⊢ Γ is LK1 KS1 proofs onto combinatorial proofs). provable in W(Γ) is provable in . Then Theorem 16 is an immediate consequence. Let us now VII. TRANSLATING BETWEEN LK1 AND KS1 proceed with providing further lemmas that will be needed for We prove Theorems 16, 17, and 18, mainly by translating the other results. derivations to and from the sequent calculus, and by rule permutation . Lemma 25. The c-rule is derivable in {ac, m, m∀, m∃, ≡}. A. The Linear Fragments MLL1X and MLS1X Proof. This can be shown by a straightforward induction on X X A (for details, see e.g. [32]). We show that MLL1 and MLS1 are equivalent. X Lemma 26. w , c , m, m , m are derivable in {w, c, ≡}. Lemma 20. If ⊢ Γ is provable in MLL1 then W(Γ) is provable ∀ ∀ ∀ ∃ X in MLS1 . Proof. We only show the cases for w∀ and c∀ (for the others see [32]): Proof. This is a straightforward induction on the proof of ⊢ Γ A ∀x.∀x.A in MLL1X, making a case analysis on the bottommost rule w w A ∨ (∀x.A) ∀x.((∀x.A) ∨ A) ⊢ ∆, A ≡ ≡ (5) instance. We show here only the case of ∀ (all other ∀x.(A ∨ A) (∀x.A) ∨ (∀x.A) ⊢ ∆, ∀x.A c c cases are simpler or have been shown before, e.g. [32]): By ∀x.A ∀x.A induction hypothesis, there is a proof of (∆) ∨ A in MLS1X. W where in the first derivation, x is not free in A, and in the We can prefix every line in that proof by ∀x and then compose second one not free in ∀x.A. the following derivation: t ∀ Lemma 27. Let A and B be formulas. Then ∀x.t MLS1X A A ∀x. W(∆) ∨ A {w,c,≡} ⇐⇒ {w,w∀,ac,c∀,m,m∀,m∃,≡} ≡ W(∆) ∨ ∀x.A B B where we can apply the ≡-rule because x is not free in ∆. Proof. Immediately from Lemmas 25 and 26.

S{A} X Remark 28. Observe that Lemma 27 would also hold with Lemma 21. Let r be an inference rule in MLS1 . Then w c S{B} the rules ∀ and ∀ removed. MLL1X the sequent ⊢ A, B is provable in . C. Rule Permutations Proof. A routine exercise. Theorem 29. Let Γ be a sequent. If ⊢ Γ is provable in LK1 (as depicted on the left below) then there is a sequent ′ not Lemma 22. Let A, B be formulas, and let S{·} be a (pos- Γ X containing any f, such that there is a derivation as shown on itive) context. If ⊢ A, B is provable in MLL1 , then so is the right below: ⊢ S{A},S{B}.

X Proof. A straightforward induction on S{·}. (see e.g. [37]) MLL1 Φ1 LK1 Φ =⇒ ⊢ W(Γ′) X Lemma 23. If a formula C is provable in MLS1 then ⊢ C ⊢ Γ {w,c,≡} Φ2 X is provable in MLL1 . ⊢ W(Γ) Proof. We proceed by induction on the number of inference Proof. First, we can replace every instance of the f-rule in steps in the proof of C in MLS1X. Consider the bottommost Φ by wk. Then the instances of wk and ctr are replaced by

7 w and c, which can then be permuted down. Details are in VIII. FONETS AND LINEAR PROOFS Appendix A. X t A. From MLL1 Proofs to Fonets Lemma 30. For every derivation MLS1X there are formulas Let Π be a MLL1X proof of a rectified sequent ⊢ Γ not A containing f. We now show how Π is translated into a linked A′ and A′′ such that t fograph JΠK = hJΓK, ∼Πi. We proceed inductively, making a {∀,ai,t} case analysis on the last rule in Π. At the same time we are ′′ A constructing a dualizer δΠ, so that in the end we can conclude {s,mix,≡} that JΠK is in fact a fonet. A′ 1) Π is ax : Then the only link is {a, a}, and δΠ is {∃} ⊢ a, a A empty. 2) Π is t : Then ∼Π and δΠ are both empty. ⊢ t Proof. First, observe that the ∃ rule can be permuted under all ⊢ Γ′ ⊢ Γ′′ the other rules since A[x/t] has the same structure as A and 3) The last rule in Π is mix : By induction ⊢ Γ′, Γ′′ none of the other rules has a premise of the form S{∃x.A}. It hypothesis, we have proofs Π′ and Π′′ of Γ′ and Γ′′, suffices now to prove that all rules in {∀, ai, t} can be permuted respectively. We have JΓK = JΓ′K + JΓ′′K and we can let over the rules in {s, mix, ≡}, which is straightforward. For s, ∼Π = ∼Π′ ∪∼Π′′ and δΠ = δΠ′ ∪ δΠ′′ . mix, and the ≡-instances that do not involve the quantifiers, ⊢ Γ1,A,B the details can be found in [38]. The ≡-instances concerning 4) The last rule in Π is ∨ : By induction the quantifiers are admissible if the ∃-rule is not present. ⊢ Γ1, A ∨ B hypothesis, there is a proof Π′ of Γ′ = Γ1,A,B. We A have JΓK = JΓ′K and let ∼Π = ∼Π′ and δΠ = δΠ′ .

Lemma 31. For every derivation {w,w∀,ac,c∀,m,m∀,m∃,≡} ⊢ Γ1, A ⊢ B, Γ2 5) The last rule in Π is ∧ : By induction B ⊢ Γ1, A ∧ B, Γ2 ′ ′ there are formulas A and B such that hypothesis, we have proofs Π′ and Π′′ of Γ′ = Γ1, A and Γ′′ = B, Γ2, respectively. We have JΓK = JΓ1K + A (JAK × JBK)+ JΓ2K and we let ∼Π = ∼Π′ ∪∼Π′′ and m m m { , ∀, ∃,≡} δΠ = δΠ′ ∪ δΠ′′ . ′ A ⊢ Γ1, A[x/t] {ac,c∀} 6) The last rule in Π is ∃ : By induction hy- ⊢ Γ1, ∃x.A ′ B pothesis, there is a proof Π′ of Γ′ =Γ1, A[x/t]. For each ′ {w,w∀,≡} atom in Γ =Γ1, A[x/t], there is a corresponding atom B in Γ=Γ1, ∃x.A. We can therefore define the linking ∼Π from the linking ∼Π′ via this correspondence. Then, we w w Proof. Permute all and ∀ instances to the bottom of the let δΠ be δΠ′ + [x/t]. Since Γ is rectified x does not yet c c m m m derivation, then permute all and ∀ below { , ∀, ∃}. This occur in δΠ′ . Hence δΠ is a dualizer of JΠK. involves a tedious but routine case analysis. However, unlike ⊢ Γ1, A most other rule permutations in this paper, this has not been 7) The last rule in Π is ∀ (x not free in Γ1) : done before in the deep inference literature. For this reason, ⊢ Γ1, ∀x.A By induction hypothesis, there is a proof ′ of ′ we give the full case analysis in Appendix B. This Lemma is Π Γ = Γ1, A, which has the same atoms as in Γ=Γ1, ∀x.A. the reason for the presence of the rules w∀ and c∀, as without Hence, we can let Π Π′ and Π Π′ . them the permutation cases in (5) could not be resolved. ∼ = ∼ δ = δ Theorem 32. If Π is a MLL1X proof of a rectified f-free We can now complete the proof of Theorems 17 and 18. sequent ⊢ Γ, then JΠK is a fonet and δΠ a dualizer for it.

Proof of Theorem 18. Assume we have a proof of A in KS1. Proof. We must show that none of the operations above By Theorem 24 we have a proof of ⊢ A in LK1 to which we introduces a bimatching. For cases 1–6, this is immediate. can apply Theorem 29. Finally, we apply Lemma 20 to get For case 7, observe that there is a potential dependency from the desired shape. each existential binder in JΓ′K to the new x-binder •x in JΓK. However, observe that this •x vertex is not connected to any Proof of Theorem 17. Assume we have a proof of A in KS1. vertex in JΓ′K, and hence no such new dependency can be We first apply Theorem 18, and then Lemma 30 to the upper extended to a bimatching. That δΠ is a dualizer for JΠK follows half and Lemmas 27 and 31 to the lower half. immediately from the construction. Hence, JΠK is a fonet.

8 X B. From MLS1 Proofs to Fonets Then ∼C⋆ consists of the pairs induced by ∼C and the new ⋆ X pairs {q , q } introduced in step 1 above. We call C the There is a more direct path from a MLL1 proof Π of i,j i,j frame of C and we define the frame of C, denoted C⋆, as a rectified sequent Γ to the linked fograph JΠK: take the ⋆ hJC K, ∼C⋆ i. fograph JΓK, and let the equivalence classes of ∼Π be all the atom pairs that meet in an instance of ax, and δΠ comprises Lemma 34. If a linked fograph C has an induced bimatching the substitutions at the ∃-rules in Π. We chose the more then so does its frame C⋆. cumbersome path above because it gives us a direct proof of X Proof. Immediately from the construction of the frame. Theorem 32. However, for translating MLS1 derivation into fonets, we employ exactly that direct path. Proof of Theorem 33. From Φ we construct a derivation Φ⋆ X In a derivation in MLS1 where the conclusion is recti- of A⋆ in the propositional fragment of MLS1X, such that fied, every line is also rectified, as the only rules involving JΦ⋆K = JΦK⋆. The rules ai, t, mix and s are translated trivially, bound variables are ∀ and ∃ which (upwards) both remove and for ≡, it suffices to observe that the frame construction is a binder. Therefore, we can call such a derivation rectified, invariant under ≡. Finally, for the rules ∀ and ∃, proceed as X and for a non-rectified MLS1 derivation Φ we can define its follows. Every instance of ∀ is replaced by the derivation on rectification Φ inductively, by rectifying each line, proceeding the right below:4 t b 3 step-wise from conclusion to premise. {ai,t} Ψ1 S{t} t t ∀ S{(qh1,j ∨ qh1,j ) ∧···∧ (qhn,j ∨ qhn,j) ∧ } MLS1X Φ S{∀yj.t} A rectified derivation determines a substitution {s,≡} Ψ2 A t S{qh1,j ∨···∨ qhn,j ∨ (qh1,j ∧···∧ qhn,j ∧ )} which maps the existential bound variables occurring in A to the terms substituted for them in the instances of the ∃-rule where h1,...,hn range over the indices of the existential in Φ. We denote this substitution by δΦ and call it the dualizer binders dependent on that yj . It is easy to see how Ψ1 is of Φ. Furthermore, every atom occurring in the conclusion A constructed. The construction of Ψ2, using s and ≡, is stan- must be consumed by a unique instance of the rule ai in Φ. dard, see, e.g. [40], [37], [41], [38]. Then, every occurrence of
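As a small end-to-end example of the construction in Section VIII-A (ours, not from the paper), consider the MLL1X proof of the rectified sequent ⊢ ∃x.p̄x, ∀y.py built from an axiom, an ∃-instance with witness y, and a ∀-instance:

\[
  \mathsf{ax}:\ \vdash \overline{p}y,\ py
  \quad\leadsto\quad
  \exists:\ \vdash \exists x.\overline{p}x,\ py
  \quad\leadsto\quad
  \forall:\ \vdash \exists x.\overline{p}x,\ \forall y.py .
\]

Following cases 1, 6 and 7 of that construction, the resulting linked fograph has a single link pairing the literals p̄x and py, and the dualizer is δΠ = [x/y]. Since δΠ assigns to the existential binder x a term containing the universal binder y, the leap graph additionally contains the dependency {•x, •y}, and, as Theorem 32 guarantees, no induced bimatching arises.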

This allows us to define a (partial) equivalence relation ∼Φ on ∀yj .F is replaced by qh1,j ∨···∨qhn,j ∨(qh1,j ∧···∧qhn,j ∧F ) the atom occurrences in A by a ∼Φ b if a and b are consumed in the derivation below that ∀-instance. Now, observe that by the same instance of ai in Φ. We call ∼Φ the linking of Φ, all instances of the ∃-rule introducing xi dependent on yj and define JΦK = hJAK, ∼Φi. must occur below in the derivation (otherwise Φ would not S{B[xi/t]} t be rectified). Now consider such an instance ∃ . S{∃xi.B} Theorem 33. Let MLS1X Φ be a rectified derivation where A Its context S{·} must contain all the ∀yj the ∃xi depends on, A such that B is in their scope. Following the translation of the f is -free. Then JΦK is a fonet and δΦ a dualizer for it. ∀ rules above, we can therefore translate the ∃-rule instance To prove this theorem, we have to show that no inference by the following derivation X rule in MLS1 can introduce a bimatching. To simplify the ′ S0{qi,k1 ∧ S1{qi,k2 ∧ · · · Sl−1{qi,kl ∧ Sl{B }}···}} argument, we introduce the frame [39] of the linked fograph {s,≡} Ψ3 C, which is a linked (propositional) cograph in which the S0{S1{··· S 1{S {q ∧ q ∧ · · · q ∧ B′}}···}} dependencies between the binders in C are encoded as links. l− l i,k1 i,k2 i,kl More formally, let C be a formula with JCK = C, to which where k1,...,kl are the indices of the universal binders on ′ we exhaustively apply the following subformula rewriting which that xi depends, and B is B in which all predicates steps, to obtain a sequent C⋆: are replaced by a nullary one (step 3 in the frame construction). 1) Encode dependencies as fresh links. For each depen- The derivation Ψ3 can be constructed in the same way as Ψ2. Doing this to all instances of the rules ∀ and ∃ in Φ dency {•xi, •yj} in C, with corresponding subformulas yields indeed a propositional derivation Φ⋆ with JΦ⋆K = JΦK⋆. ∃xi.A and ∀yj .B in C, we pick a fresh (nullary) predi- It has been shown by Retor´e [42] and rediscovered by cate symbol qi,j , and then replace ∃xi.A by qi,j ∧∃xi.A, ⋆ ⋆ Straßburger [38] that JΦ K = hJC K, ∼Φ⋆ i cannot contain an and replace ∀yj .B by qi,j ∨ ∀yj .B. 2) Erase quantifiers. After step 1, remove all the quanti- induced bimatching. By Lemma 34, JΦK does not have an induced bimatching either. Furthermore, it follows from the fiers, i.e., replace ∃xi.A by A and replace ∀yj .B by B everywhere. definition of δΦ that it is a dualizer for JΦK. 3) Simplify atoms. After step 2, replace every predicate Remark 35. There is an alternative path of proving Theo- pt1 ...tn (resp. pt1 ...tn) with a nullary predicate sym- rem 33 by translating Φ to an MLL1X-proof Π, observing that bol p (resp. p) this process preserves the linking and the dualizer. However,

3As for formulas, the rectification of a derivation is unique up to renaming of bound variables.
4For better readability we omit superfluous parentheses, knowing that we always have ≡ incorporating associativity and commutativity of ∧ and ∨.
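To see the three-step frame construction on a minimal case (our example; which copy of the fresh nullary predicate carries the bar, and the choice of indices, follow our reading of step 1), take C = ∃x.p̄x ∨ ∀y.py with the single link {p̄x, py} and the dependency {•x, •y}:

\[
  \exists x.\overline{p}x \vee \forall y.py
  \;\xrightarrow{\ 1\ }\;
  (q_{1,1} \wedge \exists x.\overline{p}x) \vee (\overline{q}_{1,1} \vee \forall y.py)
  \;\xrightarrow{\ 2\ }\;
  (q_{1,1} \wedge \overline{p}x) \vee (\overline{q}_{1,1} \vee py)
  \;\xrightarrow{\ 3\ }\;
  (q_{1,1} \wedge \overline{p}) \vee (\overline{q}_{1,1} \vee p) .
\]

The frame C⋆ is then a linked propositional cograph with two links: the original pair {p̄, p} and the new pair {q1,1, q̄1,1} encoding the dependency.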

9 for this, we have to extend the construction from the previous D. From Fonets to MLS1X Proofs cut subsection to the -rule, and then show that linking and du- We can now straightforwardly obtain the same result for alizer of a sequent proof Π are invariant under cut elimination. MLS1X: This can be done similarly to unification nets in [39]. Theorem 38. Let hC, ∼Ci be a fonet, and let C be a formula t C. From Fonets to MLL1X Proofs with JCK = C. Then there is a derivation MLS1X Φ such that Now we are going to show how from a given fonet hC, ∼Ci C we can construct a sequent proof Π in MLL1X such that JΠK = JΦK = hC, ∼Ci. hC, ∼Ci. In the proof net literature, this operation is also called sequentialization. The basic idea behind our sequentialization Proof. We apply Theorem 37 to obtain a sequent proof Π is to use the frame of , to which we can apply the splitting of ⊢ C with JΠK = hC, ∼Ci. Then we apply Lemma 20, ob- C X X tensor theorem, and then reconstruct the sequent proof Π. serving that the translation from MLL1 to MLS1 preserves Let Γ be a propositional sequent and ∼Γ be a linking for linking and dualizer. JΓK. A conjunction formula A ∧ B is splitting or a splitting Remark 39. Note that it is also possible to do a direct ′ ′′ tensor if Γ=Γ , A ∧ B, Γ and ∼Γ = ∼1∪ ∼2, such that “sequentialization” into the deep inference system MLS1X, ′ ′′ ∼1 is a linking for JΓ , AK and ∼2 is a linking for JB, Γ K, using the techniques presented in [38] and [46]. i.e., removing the ∧ from A ∧ B splits the linked fograph hJΓK, ∼Γi into two fographs. We say that hJΓK, ∼Γi is mixed IX. SKEW BIFIBRATIONS AND RESOURCE MANAGEMENT iff ′ ′′ and Γ 1 2, such that 1 is a linking Γ=Γ , Γ ∼ = ∼ ∪ ∼ ∼ In this section we establish the relation between skew for JΓ′K and ∼2 is a linking for JΓ′′K. Finally, hJΓK, ∼Γi is bifibrations and derivations in {w, w , ac, c , m, m , m , ≡}. splittable if it is mixed or has a splitting tensor. ∀ ∀ ∀ ∃ However, if a derivation Φ contains instances of the rules c∀, Theorem 36. Let Γ be a f-free propositional sequent contain- m∀, and m∃ we can no longer naively define the rectification X ing only atoms and -formulas, and Γ be a linking for . Φ as in the previous section for MLS1 , as these two rules ∧ ∼ JΓK b If hJΓK, ∼Γi does not induce a bimatching then it is splittable. cannot be applied if premise and conclusion are rectified. For this reason we define here rectified versions c∀, m∀ and m∃, This is the well-known splitting-tensor-theorem [19], [43], shown below: b c c mix adapted for the presence of . In the setting of linked S{(∀y.Ay) ∨ (∀z.Bz)} cographs, it has first been proved by Retor´e[44], [45] and m∀ S{∀x.(Ax ∨ Bx)} then rediscovered by Hughes [9]. We use it now for our S{∀y.∀x.Ax} c c∀ sequentialization: b S{∀x.Ax} S{(∃y.Ay) ∨ (∃z.Bz)} m∃ Theorem 37. Let hC, ∼Ci be a fonet, and let Γ be a sequent c S{∃x.(Ax ∨ Bx)} MLL1X with JΓK = C. Then there is an -proof Π of Γ, such Here, we use the notation A· for a formula A with occurrences that JΠK = hC, ∼Ci. of a placeholder · for a variable. Then Ax stands for the results

Proof. Let δC be the dualizer of hC, ∼Ci. We proceed by of replacing that placeholder with x, and also indicating that induction on the size of Γ (i.e., the number of symbols in it, x must not occur in A·. Then ∀x.Ax and ∀y.Ay are the same without counting the commas). If Γ contains a formula with ∨- formula modulo renaming of the bound variable bound by the root, or a formula ∀x.A, we can immediately apply the ∨-rule outermost ∀-quantifier. We also demand that the variables x, or the ∀-rule of MLL1X and proceed by induction hypothesis. y, and z do not occur in the context S{·}. m m If Γ contains a formula ∃x.A such that the corresponding Note that in an instance of ∀ or ∃ (as shown above), c c binder •x in C has no dependency, then we can apply the we can have x = y or x = z, but not both if the premise is m m ∃-rule, choosing the term t as determined by δC, and proceed rectified. If x = y and x = z we have ∀ and ∃ as special by induction hypothesis. Hence, we can now assume that cases of m∀ and m∃, respectively. And similarly, if x = y then c c c c Γ contains only atoms, ∧-formulas, or formulas of shape ∀ is a special case of ∀. bw w ac c m m m ∃x.A, where the vertex •x has dependencies. Then the frame For a derivation Φ in { , ∀, , ∀, , ∀, ∃, ≡}, we can ⋆ Γ⋆ does not induce a bimatching and contains only now construct the rectification Φ by rectifying each line of Φ, hJΓ K, ∼ i b atoms and ∧-formulas, and is therefore splittable. If it is mixed, yielding a derivation in {w, w∀, ac, c∀, m, m∀, m∃, ≡}. then we can apply the mix-rule to and apply the induction Q b c c Γ For each instance r of an inference rule in hypothesis to the two components. If it is not mixed then P there must be a splitting tensor. If the splitting ∧ is already {w, w∀, ac, c∀, m, m∀, m∃, ≡} we can define the induced b c c in Γ, then we can apply the ∧-rule and proceed by induction ⌊r⌋: VJQK → VJP K which acts as the identity for ⋆ hypothesis on the two branches. However, if Γ is not mixed r ∈ {m, ≡} and as the canonical injection for r ∈ {w, w∀}. and all splitting tensors are ∧-formulas introduced in step 1 For r = ac it maps the vertices corresponding to the two of the frame construction, then we get a contradiction as in atoms in the premise to the vertex of the contracted atom that case there must be a ∨- or ∀-formula in Γ. in the conclusion, and for r ∈ {c∀, m∀, m∃} it maps the two b c c

10 vertices corresponding to the quantifiers in the premise to the an instance of the same rule, c∀ is translated to ac, and w∀ is one in the conclusion (and acts as the identity on all other translated to w. b b ◦ vertices). For a derivation Φ in {w, w∀, ac, c∀, m, m∀, m∃, ≡} A ◦ ◦ we can then define the induced map ⌊Φ⌋ asb the compositionc c This gives us a derivation {w,ac,acb ,m,≡ } Φ such that of the induced maps of the rule instances in Φ. B◦ ⌊Φ◦⌋ = ⌊Φ⌋. It has been shown in [23] that ⌊Φ◦⌋ is a skew A fibration. Hence, ⌊Φ⌋ is a skew fibration. Lemma 40. Let {w,w∀,ac,c∀,m,m∀,m∃,≡} Φ be given. Then there A B B. From Skew Bifibrations to Contraction and Weakening bb is a rectified derivation {w,w∀,ac,cb∀,m,mc∀,mc∃,≡} Φ, such that the Lemma 43. Let A and B be fographs, let ϕ: A→B be a B b skew bifibration, and let A and B be formulas with JAK = A induced maps ⌊Φ⌋: JAK → JBK and ⌊Φ⌋: JAK → JBK are b b b and JBK = B. Then there are derivations equal up to a variable renaming of the vertex labels. Proof. Immediate from the definition. A Aρϕ b {w,w∀,ac,cb∀,m,mc∀,mc∃,≡} Φ and {w,w∀,ac,c∀,m,m∀,m∃,≡} Φ A. From Contraction and Weakening to Skew Bifibrations B B A b c c such that ⌊Φ⌋ = ϕ and Φ is a rectification of Φ, and ρ is Lemma 41. Let {w,w∀,ac,c∀,m,m∀,m∃,≡} Φ be a rectified deriva- b b ϕ B the substitution induced by ϕ. tion. Then the induced map ⌊Φ⌋: JAK → JBK is a skew In the proof of this lemma, we make use of the following bifibration. P Before we show the proof of this lemma, we introduce concept: Let S Ψ be a derivation where P and Q are proposi- another useful concept: the propositional encoding A◦ of a Q formula A, which is a with the property tional formulas (possibly using variable x ∈ VAR at the places ′ that JA◦K = JAK. For this, we introduce new propositional of atoms). We say that Ψ can be lifted to S if there are (first- ◦ ◦ variables that have the same names as the (first-order) variables order) formulas C and D such that P = C and Q = D and x ∈ VAR. Then A◦ is defined inductively by: C there is a derivation S′ Ψ′. a◦ = a (∀xA)◦ = x ∨ A◦ D (A ∨ B)◦ = A◦ ∨ B◦ (∃xA)◦ = x ∧ A◦ We say a fograph homomorphism ϕ: G → H is full if for all (A ∧ B)◦ = A◦ ∧ B◦ v, w ∈ VG , we have that ϕ(v)ϕ(w) ∈ EH implies vw ∈ EG. ◦ Lemma 42. For every formula A, we have JA K = JAK. Lemma 44. Let ϕ: G → H be full and injective skew Proof. A straightforward induction on A. bifibration such that ρϕ is the identity substitution, and let G and H be formulas with JGK = G and JHK = H. Then We use ≡◦ to denote the restriction of ≡ to propositional G formulas, i.e., the first two lines in (2). there is a derivation {w,w∀,≡} Φ. Proof of Lemma 41. First, observe that for every inference H rule r ∈ {w, w∀, ac, c∀, m, m∀, m∃, ≡} the induced map Proof. By [23, 7.6.1], we have a derivation b c c ◦ ⌊r⌋: VJQK → VJP K defines an existential-preserving graph ho- G ◦ momorphism JQK → JP K and a fibration on the corresponding {w,≡ } Ψ. In order to lift Ψ, we need to reorganize the binding graphs. Therefore, their composition ⌊Φ⌋ has the same H◦ properties of fibration. instances of w. If H contains a subformula ∀x.A which is For showing that it is also a skew fibration, we construct not present in G, the w-instances in Ψ could introduce the for Φ its propositional encoding Φ◦ by translating every line parts of the propositional encoding x ∨ A independently. We into its propositional encoding. 
The instances of the rules m∀ say that an instance r1 of w in Φ is in the scope of an instance c and m∃ are replaced by: r2 of w if r1 introduced formulas that contain a free variable c r S{(y ∨ (Ay)◦) ∨ (z ∨ (Bz)◦)} S{(y ∧ (Ay)◦) ∨ (z ∧ (Bz)◦)} x (i.e., x occurs in a term in a predicate) and 2 introduces ≡ m the atom x as a subformula (i.e. the propositional encoding S{(y ∨ z) ∨ ((Ay)◦ ∨ (Bz)◦)} S{(y ∨ z) ∧ ((Ay)◦ ∨ (Bz)◦)} ac ac of the binder x). We can now permute the w-instances in Ψ b S{x ∨ ((Ax)◦ ∨ (Bx)◦)} b S{x ∧ ((Ax)◦ ∨ (Bx)◦)} such that whenever a rule instance r1 is in the scope of an respectively, where ac is a ac that renames the variables—the instance r2, then r2 occurs below r1 in Ψ. Then we can lift Ψ propositional variable,b as well as the first-order variable of the stepwise. First, observe that each line of Ψ is ≡◦-equivalent same name—as everything is rectified, there is no ambiguity to the propositional encoding P ◦ of a first-order formula P . here. Any instance of a rule w, ac, m, or ≡ is translated to We now have to show that each instance of w in Ψ is indeed

11 the image of a correct application of w or w∀ in first-order X. SUMMARY AND PROOF OF MAIN RESULT w logic. If we have a of the form The only theorem of Section VI that has not yet been proved S◦{A◦} S◦{A◦} is Theorem 19 establishing the full correspondence between w or w S◦{x ∨ A◦} S◦{(x ∨ B◦) ∨ A◦} decomposed proofs in KS1 and combinatorial proofs. We show the proof here, by summarizing the results of the previous two then x cannot occur freely in A, as otherwise the fibration Sections VIII and IX. property would be violated. We can therefore lift these in- stances to Proof of Theorem 19. First, assume we have a combinatorial S{A} S{A} proof ϕ: C → A and a formula A with A = JAK. Let C be a w∀ or w S{∀x.A} S{(∀x.B) ∨ A} formula with JCK = C, and let ρϕ be the substitution induced by ϕ. By Lemma 43 there is a derivation respectively. If a weakening happens inside a subformula x ∨ ◦ ◦ C or x ∧ C in Ψ, then there are the following cases: Cρϕ S◦{x ∨ C◦} S◦{x ∧ C◦} S◦{x ∧ C◦} {w,w∀,ac,c∀,m,m∀,m∃,≡} Φ2 w w w S◦{x ∨ D◦ ∨ C◦} S◦{x ∧ (D◦ ∨ C◦)} S◦{(x ∨ D◦) ∧ C◦} A The first two cases can be lifted to Since C is a fonet, we have by Theorem 38 a derivation S{∀x.C} S{∃x.C} t w and w X ′ S{∀x.(D ∨ C)} S{∃x.(D ∨ C)} MLS1 Φ1 respectively. But in the third case, an ∃-quantifier would be C transformed into an ∀-quantifier. But as ϕ has to preserve This derivation remains valid if we apply the substitution ρϕ to existentials, this third case cannot occur. All other situations ′ G every line in Φ1, yielding the derivation Φ1 of Cρϕ as desired. Conversely, assume we have a decomposed derivation can be lifted trivially, giving us {w,w∀,≡} Φ as desired. H t X Lemma 45. Let ϕ: G → H be a surjective skew bifibration, MLS1 Φ1 and let G and H be formulas with JGK = G and JHK = H. A′ (6)

Then there is a derivation {w,w∀,ac,c∀,m,m∀,m∃,≡} Φ2

Gρϕ A

{ac,c∀,m,m∀,m∃,≡} Φ Then we can transform Φ1 into a rectified form Φ1, proving H b A′. By Theorem 33, the linked fograph JΦ1K = hJA′K, ∼b i b b b Φ1 where ρϕ is the substitution induced by ϕ. is a fonet. Then, by Lemma 40, there is a rectified derivation Proof. By [47, Proposition 7.5], there is a derivation A′ ◦ cc ′ (Gσϕ) {w,w∀,ac,cb∀,m,mc∀,mc∃,≡} Φ2 whose induced map ⌊Φ2⌋: JA K → ◦ c c {ac,m,≡ } Ψ . We can lift Ψ to a first-order derivation in A ◦ b H JAK is the same as the induced map ⌊Φ2⌋: JA′K → JAK of ac c m m m b { , ∀, , ∀, ∃, ≡}, in a similar way as in the previous Φ2. By Lemma 41, this map is a skew bifibration. Hence, we lemma. The technical details are in Appendix C. have a combinatorial proof ϕ: C → JAK with C = JA′K. c ′ Proof of Lemma 43. Let VB ⊆ VB be the image of ϕ, and let Note that Theorem 19 shows at the same time soundness, ′ B1 be the subgraph of B induced by VB. Hence, we have two completeness, and full completeness, as maps ϕ′′ : A→B1 being a surjection and ϕ′ : B1 →B being 1) everyproof in KS1 can be translated into a combinatorial a full injection. Both, ϕ′ and ϕ′′ remain skew bifibrations. proof, and Furthermore, B1 is also a fograph. Let B1 be a formula with 2) every combinatorial proof is the image of a KS1-proof JB1K = B1. We can apply Lemmas 44 and 45 to obtain under that translation. derivations B1 Aρϕ′′ XI. CONCLUSION w w Φ′ ac c m m m Φ′′ { , ∀,≡} and { , ∀, , ∀, ∃,≡} We uncovered a close correspondence between first-order B B1 combinatorial proofs and decomposed deep inference deriva- As ρϕ′ is the identity, we have ρϕ′′ = ρϕ. Hence, the tions of system KS1, and showed that every proof in KS1 has composition of Φ′′ and Φ′ is the desired derivation Φ. Then such a decomposed form. Φ can be constructed by rectifying Φ, where the variables The most surprising discovery for us was that all technical b to be used in A are already given. That ϕ = ⌊Φ⌋ follows difficulties in our work could be reduced (in a non-trivial way) b immediately from the construction. to the propositional setting.
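Since the reduction to the propositional setting rests on the propositional encoding A° of Section IX, here is a tiny worked instance (ours, not from the paper):

\[
  \big(\exists x.(\overline{p}x \vee \forall y.py)\big)^{\circ}
  \;=\; x \wedge \big((\overline{p}x)^{\circ} \vee (\forall y.py)^{\circ}\big)
  \;=\; x \wedge (\overline{p}x \vee (y \vee py)) ,
\]

where x and y are now read as propositional variables; by Lemma 42 this propositional formula has the same graph as the original first-order formula.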

The obvious next step in our research is to investigate proof composition and normalisation of first-order combinatorial proofs. Even in the propositional setting, the normalisation of combinatorial proofs is underdeveloped. There exist two different procedures for cut elimination for combinatorial proofs in classical propositional logic [10], [12], but both have their insufficiencies, and neither has been extended to other logics.

We hope to garner new insights on the normalisation of classical first-order proofs through our work on combinatorial proofs.

REFERENCES

[1] G. Frege, Begriffsschrift. Louis Nebert, Halle, 1879. English translation in: J. van Heijenoort (ed.), From Frege to Gödel, Harvard University Press, 1977.
[2] D. Hilbert, "Die logischen Grundlagen der Mathematik," Mathematische Annalen, vol. 88, pp. 151–165, 1922.
[3] G. Gentzen, "Untersuchungen über das logische Schließen. I." Mathematische Zeitschrift, vol. 39, pp. 176–210, 1935.
[4] G. Gentzen, "Untersuchungen über das logische Schließen. II." Mathematische Zeitschrift, vol. 39, pp. 405–431, 1935.
[5] R. M. Smullyan, First-Order Logic. Berlin: Springer-Verlag, 1968.
[6] J. A. Robinson, "A Machine-Oriented Logic Based on the Resolution Principle," Journal of the ACM, vol. 12, pp. 23–41, 1965.
[7] R. Thiele, "Hilbert's Twenty-fourth Problem," American Mathematical Monthly, vol. 110, pp. 1–24, 2003.
[8] D. Hilbert, "Mathematische Probleme," Nachrichten der Königlichen Gesellschaft der Wissenschaften zu Göttingen, mathematisch-physikalische Klasse, vol. 3, pp. 253–297, 1900.
[9] D. Hughes, "Proofs Without Syntax," Annals of Mathematics, vol. 164, no. 3, pp. 1065–1076, 2006.
[10] D. Hughes, "Towards Hilbert's 24th Problem: Combinatorial Proof Invariants (Preliminary Version)," Electronic Notes in Theoretical Computer Science, vol. 165, pp. 37–63, 2006.
[11] L. Straßburger, "The Problem of Proof Identity, and Why Computer Scientists Should Care About Hilbert's 24th Problem," Philosophical Transactions of the Royal Society A, vol. 377, no. 2140, p. 20180038, 2019.
[12] L. Straßburger, "Combinatorial Flows and Their Normalisation," in 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Schloss Dagstuhl–Leibniz-Zentrum für Informatik, 2017.
[13] L. Straßburger, "Combinatorial Flows and Proof Compression," Inria Saclay, Research Report RR-9048, 2017. [Online]. Available: https://hal.inria.fr/hal-01498468
[14] M. Acclavio and L. Straßburger, "From Syntactic Proofs to Combinatorial Proofs," in Automated Reasoning – 9th International Joint Conference, IJCAR 2018, Held as Part of the Federated Logic Conference, FloC 2018, Oxford, UK, July 14–17, 2018, Proceedings, D. Galmiche, S. Schulz, and R. Sebastiani, Eds., vol. 10900. Springer, 2018, pp. 481–497.
[15] M. Acclavio and L. Straßburger, "On Combinatorial Proofs for Logics of Relevance and Entailment," in 26th Workshop on Logic, Language, Information and Computation (WoLLIC 2019), R. Iemhoff and M. Moortgat, Eds. Springer, 2019.
[16] M. Acclavio and L. Straßburger, "On Combinatorial Proofs for Modal Logic," in International Conference on Automated Reasoning with Analytic Tableaux and Related Methods. Springer, 2019, pp. 223–240.
[17] W. Heijltjes, D. Hughes, and L. Straßburger, "Intuitionistic Proofs Without Syntax," in 2019 34th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS). IEEE, 2019, pp. 1–13.
[18] D. Hughes, "First-order Proofs Without Syntax," arXiv preprint arXiv:1906.11236, 2019.
[19] J.-Y. Girard, "Linear Logic," Theoretical Computer Science, vol. 50, pp. 1–102, 1987.
[20] S. Abramsky and R. Jagadeesan, "Games and Full Completeness for Multiplicative Linear Logic," Journal of Symbolic Logic, vol. 59, no. 2, pp. 543–574, 1994.
[21] J. Herbrand, "Recherches sur la Théorie de la Démonstration," Ph.D. dissertation, University of Paris, 1930.
[22] K. Brünnler, "Cut Elimination Inside a Deep Inference System for Classical Predicate Logic," Studia Logica, vol. 82, no. 1, pp. 51–71, 2006.
[23] L. Straßburger, "A Characterization of Medial as Rewriting Rule," in International Conference on Rewriting Techniques and Applications. Springer, 2007, pp. 344–358.
[24] A. S. Troelstra and H. Schwichtenberg, Basic Proof Theory. Cambridge University Press, 2000, no. 43.
[25] J.-Y. Girard, "Quantifiers in Linear Logic," Temi e prospettive della logica e della filosofia della scienza contemporanee, vol. 1, pp. 95–130, 1988.
[26] A. Fleury and C. Retoré, "The Mix Rule," Mathematical Structures in Computer Science, vol. 4, no. 2, pp. 273–285, 1994.
[27] G. Bellin, "Subnets of Proof-nets in Multiplicative Linear Logic with MIX," Mathematical Structures in Computer Science, vol. 7, no. 6, pp. 663–699, 1997.
[28] H. Lerchs, "On cliques and kernels," Tech. report, University of Toronto, 1971.
[29] D. G. Corneil, H. Lerchs, and L. K. Stewart-Burlingham, "Complement reducible graphs," Discrete Applied Mathematics, 1981.
[30] A. Grothendieck, "Technique de descente et théorèmes d'existence en géométrie algébrique. I. Généralités. Descente par morphismes fidèlement plats," in Séminaire Bourbaki: années 1958/59–1959/60, exposés 169–204. Société mathématique de France, 1960.
[31] J. W. Gray, "Fibred and cofibred categories," in Proc. Conf. on Categorical Algebra '65. Springer, 1966, pp. 21–83.
[32] K. Brünnler, "Deep Inference and Symmetry for Classical Proofs," Ph.D. dissertation, Technische Universität Dresden, 2003.
[33] B. Ralph, "Modular Normalisation of Classical Proofs," Ph.D. dissertation, University of Bath, 2019.
[34] A. A. Tubella and A. Guglielmi, "Subatomic Proof Systems: Splittable Systems," ACM Transactions on Computational Logic (TOCL), vol. 19, no. 1, pp. 1–33, 2018.
[35] K. Brünnler, "Locality for Classical Logic," Notre Dame Journal of Formal Logic, vol. 47, no. 4, pp. 557–580, 2006. [Online]. Available: http://www.iam.unibe.ch/~kai/Papers/LocalityClassical.pdf
[36] J.-Y. Girard, "Proof-nets: The Parallel Syntax for Proof-theory," in Logic and Algebra, A. Ursini and P. Agliano, Eds. Marcel Dekker, New York, 1996.
[37] A. Guglielmi and L. Straßburger, "Non-commutativity and MELL in the Calculus of Structures," in Computer Science Logic, CSL 2001, ser. LNCS, L. Fribourg, Ed., vol. 2142. Springer-Verlag, 2001, pp. 54–68.
[38] L. Straßburger, "Linear Logic and Noncommutativity in the Calculus of Structures," Ph.D. dissertation, Technische Universität Dresden, 2003.
[39] D. Hughes, "Unification Nets: Canonical Proof Net Quantifiers," in Proceedings of the 33rd Annual ACM/IEEE Symposium on Logic in Computer Science, 2018, pp. 540–549.
[40] A. A. Tubella and L. Straßburger, "Introduction to Deep Inference," lecture notes for ESSLLI'19, 2019. [Online]. Available: https://hal.inria.fr/hal-02390267
[41] K. Brünnler and A. F. Tiu, "A Local System for Classical Logic," in International Conference on Logic for Programming Artificial Intelligence and Reasoning. Springer, 2001, pp. 347–361.
[42] C. Retoré, "Pomset Logic as a Calculus of Directed Cographs," INRIA, Research Report RR-3714, 1999. [Online]. Available: https://hal.inria.fr/inria-00072953
[43] V. Danos and L. Regnier, "The Structure of Multiplicatives," Archive for Mathematical Logic, vol. 28, no. 3, pp. 181–203, 1989.
[44] C. Retoré, "Handsome Proof-nets: Perfect Matchings and Cographs," Theoretical Computer Science, vol. 294, no. 3, pp. 473–488, 2003.
[45] C. Retoré, "Handsome Proof-nets: R&B-Graphs, Perfect Matchings and Series-parallel Graphs," INRIA, Research Report RR-3652, 1999. [Online]. Available: https://hal.inria.fr/inria-00073020
[46] L. Straßburger, "Deep Inference and Expansion Trees for Second-order Multiplicative Linear Logic," Mathematical Structures in Computer Science, vol. 29, pp. 1030–1060, 2019.
[47] B. Ralph and L. Straßburger, "Towards a Combinatorial Proof Theory," in International Conference on Automated Reasoning with Analytic Tableaux and Related Methods. Springer, 2019, pp. 259–276.
[48] F. Lamarche, "Exploring the Gap Between Linear and Classical Logic," Theory and Applications of Categories, vol. 18, no. 18, pp. 473–535, 2007.

APPENDIX

A. Proof of Theorem 29

Proof of Theorem 29. Write fv(A) for the set of variables which occur free in A. Note that the instances of w and c in Φ2 are deep, but occur inside sequent contexts.

First, if an instance of wk is followed by a rule of MLL1X in whose principal formula A does not occur, it can be permuted downwards. Otherwise, the proof can be transformed using the following rewriting rules:

• wk followed by ∧: the steps ⊢ Γ →wk ⊢ Γ, A, which together with ⊢ B, ∆ yield ⊢ Γ, A ∧ B, ∆ by ∧, are replaced by a sequence of wk instances deriving ⊢ Γ, A ∧ B, ∆ directly from ⊢ Γ (the derivation of ⊢ B, ∆ is discarded);
• wk followed by ∨: the steps ⊢ Γ, A →wk ⊢ Γ, A, B →∨ ⊢ Γ, A ∨ B are replaced by the deep rule w: ⊢ Γ, A →w ⊢ Γ, A ∨ B;
• wk followed by ∃: the steps ⊢ Γ →wk ⊢ Γ, A[x/t] →∃ ⊢ Γ, ∃x.A are replaced by ⊢ Γ →wk ⊢ Γ, ∃x.A;
• wk followed by ∀: the steps ⊢ Γ →wk ⊢ Γ, A →∀ ⊢ Γ, ∀x.A are replaced by ⊢ Γ →wk ⊢ Γ, ∀x.A;
• wk followed by ctr: the steps ⊢ Γ, A →wk ⊢ Γ, A, A →ctr ⊢ Γ, A are deleted, leaving ⊢ Γ, A unchanged.

Note that in the case of ∨ we use the deep rule w, which can be permuted under all the rules. By using these rewriting rules, we can eventually get a derivation with all the instances of wk and w at the bottom. Now observe that the instances of ctr in Φ can be transformed using the following rule: ⊢ Γ, A, A →ctr ⊢ Γ, A is replaced by ⊢ Γ, A, A →∨ ⊢ Γ, A ∨ A →c ⊢ Γ, A.

Knowing that c can be permuted under all the rules of MLL1X, we eventually obtain a derivation consisting of a proof Φ′1 of ⊢ Γ′ in MLL1X, followed by a derivation Φ′2 from ⊢ Γ′ to ⊢ Γ in {wk, w, c, ≡}. Note that ≡ is required here since the permutation of formulas is implicit in MLL1X.

By transforming each sequent of Φ′2 into its corresponding formula, and by considering the rewriting rule that replaces ⊢ Γ →wk ⊢ Γ, A by ⊢ ⋁(Γ) →w ⊢ ⋁(Γ) ∨ A, we obtain a derivation consisting of a proof Φ1 of ⊢ ⋁(Γ′) in MLL1X, followed by a derivation Φ2 from ⊢ ⋁(Γ′) to ⊢ ⋁(Γ) in {w, c, ≡}, where Φ1 can be obtained from Φ′1 by applying the ∨ rule.

B. Rule permutation for the proof of Lemma 31

We construct a rewriting system based on rule permutation on derivations in {w, w∀, ac, c∀, m, m∀, m∃, ≡} that allows us to reach, from any derivation, a derivation of the form

A
  || {m, m∀, m∃, ≡}
A′
  || {ac, c∀}
B′
  || {w, w∀, ≡}
B

Intuitively, we want to move all the instances of r ∈ {w, w∀} downwards and all the instances of r′ ∈ {m, m∀, m∃} upwards.
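Purely as an illustration of this strategy (and not as part of the proof), the global shape of the argument can be pictured as a bubble-sort-like pass over the sequence of rule instances in a derivation: assign each rule a target phase and repeatedly swap adjacent out-of-order pairs. The following Python sketch is a toy abstraction of that idea; the rule names and phase assignment are ours, instances of ≡ are ignored, and real permutation steps also rewrite formulas, may delete instances (as in the w/ac cancellation below), or may replace one instance by several.

# Toy illustration only, not the rewriting system defined in this appendix:
# a derivation is abstracted as a top-to-bottom list of rule names, and we
# "permute" rules by swapping adjacent out-of-order pairs until every
# {m, m_A, m_E} instance is above every {ac, c_A} instance, which in turn
# is above every {w, w_A} instance.

PHASE = {"m": 0, "m_A": 0, "m_E": 0,   # m-type rules: move upwards
         "ac": 1, "c_A": 1,            # contraction rules: middle phase
         "w": 2, "w_A": 2}             # weakening rules: move downwards

def permute_to_normal_form(rules):
    """Bubble adjacent out-of-order pairs until the list is phase-sorted."""
    rules = list(rules)
    changed = True
    while changed:
        changed = False
        for i in range(len(rules) - 1):
            if PHASE[rules[i]] > PHASE[rules[i + 1]]:
                rules[i], rules[i + 1] = rules[i + 1], rules[i]
                changed = True
    return rules

if __name__ == "__main__":
    example = ["w", "m", "ac", "w_A", "m_E", "c_A"]
    print(permute_to_normal_form(example))
    # ['m', 'm_E', 'ac', 'c_A', 'w', 'w_A']

In the real system, each such swap corresponds to one of the local permutation cases listed next.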

We first study the interactions between two rules. Certain cases are unsolved at this stage; they are considered later, when we study the interactions between two non-≡ rule instances separated by an instance of ≡. Only non-trivial cases are presented here. (A fully spelled-out instance of one of these permutation steps is displayed at the end of this subsection.)

• r1/r2, where r1 ∈ {w, w∀} and r2 ∈ {ac, c∀, m, m∀, m∃}: the instance of r1 is permuted below the instance of r2, becomes a deep instance, or the two instances cancel. For example:
  – a →w a ∨ a →ac a is deleted, leaving a unchanged;
  – A ∧ C →w (A ∧ C) ∨ (B ∧ D) →m (A ∨ B) ∧ (C ∨ D) is replaced by two deep instances of w: A ∧ C →w (A ∨ B) ∧ C →w (A ∨ B) ∧ (C ∨ D);
  – ∀x.A →w (∀x.A) ∨ (∀x.B) →m∀ ∀x.(A ∨ B) is replaced by a single deep instance of w inside the quantifier: ∀x.A →w ∀x.(A ∨ B);
  – ∀x.A →w∀ ∀x.∀x.A →c∀ ∀x.A is deleted, leaving ∀x.A unchanged;
  – A ∨ (∀x.B) →w∀ (∀x.A) ∨ (∀x.B) →m∀ ∀x.(A ∨ B) is replaced by a single instance of ≡, where in this last case x is not free in A.

• r1/r2, where r1 ∈ {ac, c∀} and r2 ∈ {m, m∀, m∃}: the contraction instance is permuted below the m-type instance, at the cost of additional ≡ steps. A typical instance rewrites S{(∀x.∀x.A) ∨ (∀x.B)} →c∀ S{(∀x.A) ∨ (∀x.B)} →m∀ S{∀x.(A ∨ B)} so that the m∀ instance, producing S{∀x.((∀x.A) ∨ B)}, is applied first and the contraction afterwards.

• c∀/≡ and ≡/c∀: the instance of c∀ is permuted with the adjacent instance of ≡ by adjusting the ≡ step; for example, ∀x.∀x.∀y.A →c∀ ∀x.∀y.A →≡ ∀y.∀x.A is replaced by ∀x.∀x.∀y.A →≡ ∀y.∀x.∀x.A →c∀ ∀y.∀x.A. In the cases where the redex sits next to a disjunct B, as in (∀x.∀x.A) ∨ B and ∀x.((∀x.A) ∨ B), x is not free in B, and the constraint on x on the left-hand side implies that on the right-hand side.

• w/≡: the weakening is permuted below the ≡ instance by weakening directly into the corresponding position of the rearranged formula; for example, A →w A ∨ B →≡ B ∨ A is replaced by A →w B ∨ A, and ∀x.A →w ∀x.(A ∨ B) →≡ (∀x.A) ∨ B is replaced by ∀x.A →w (∀x.A) ∨ B. In the cases involving a quantifier, x is not free in B.

• ≡/m: the ≡ instance is permuted below the instance of m; for example, (B ∧ D) ∨ (A ∧ C) →≡ (A ∧ C) ∨ (B ∧ D) →m (A ∨ B) ∧ (C ∨ D) is replaced by (B ∧ D) ∨ (A ∧ C) →m (B ∨ A) ∧ (D ∨ C) →≡ (A ∨ B) ∧ (C ∨ D). Analogous cases handle associativity steps such as ((A ∧ C) ∧ E) ∨ (B ∧ D) ≡ (A ∧ (C ∧ E)) ∨ (B ∧ D), and the quantifier step (∀x.(A ∧ C)) ∨ (B ∧ D) ≡ ∀x.((A ∧ C) ∨ (B ∧ D)), with x not free in B ∧ D, followed by m.

• w∀/≡: in the two cases here we assume x ≠ y (otherwise they are trivial); for example, ∀y.A →w∀ ∀x.∀y.A →≡ ∀y.∀x.A, with x not free in ∀y.A, is replaced by the deep instance ∀y.A →w∀ ∀y.∀x.A, with x not free in A, and symmetrically.

• ≡/m∀: the ≡ instance is permuted below the instance of m∀; for example, (∀x.B) ∨ (∀x.A) →≡ (∀x.A) ∨ (∀x.B) →m∀ ∀x.(A ∨ B) is replaced by (∀x.B) ∨ (∀x.A) →m∀ ∀x.(B ∨ A) →≡ ∀x.(A ∨ B). A further case first exchanges quantifiers, (∀y.∀x.A) ∨ (∀x.B) ≡ (∀x.∀y.A) ∨ (∀x.B), before the m∀ instance producing ∀x.((∀y.A) ∨ B), and is treated in the same way.

• ≡/m∃: similar to ≡/m∀.

Interactions between two non-≡ rules with an instance of ≡ in between:

• c∀/≡/r, where r ∈ {m, m∀, m∃}: first permute c∀ under ≡, and then permute c∀ under r.

• ac/≡/r, where r ∈ {m, m∀, m∃}: first permute ac under ≡, and then permute ac under r.

• w/≡/c∀: the weakening is permuted below the ≡ and c∀ instances; for example, ∀x.∀x.A →w ∀x.((∀x.A) ∨ B) →≡ (∀x.∀x.A) ∨ B →c∀ (∀x.A) ∨ B is replaced by ∀x.∀x.A →c∀ ∀x.A →w (∀x.A) ∨ B, where x is not free in B.

• w/≡/m∀: for example, ∀x.B →w (∀x.B) ∨ (∀x.A) →≡ (∀x.A) ∨ (∀x.B) →m∀ ∀x.(A ∨ B) is replaced by ∀x.B →w ∀x.(B ∨ A) →≡ ∀x.(A ∨ B); in all four cases of this kind, x is not free in B.

• w/≡/m∃: for example, ∃x.B →w (∃x.B) ∨ (∃x.A) →≡ (∃x.A) ∨ (∃x.B) →m∃ ∃x.(A ∨ B) is replaced by ∃x.B →w ∃x.(B ∨ A) →≡ ∃x.(A ∨ B).

• w/≡/ac: for example, a ∨ B →w (a ∨ B) ∨ a →≡ (a ∨ a) ∨ B →ac a ∨ B is deleted, leaving a ∨ B unchanged, and a →w a ∨ (a ∨ B) →≡ (a ∨ a) ∨ B →ac a ∨ B is replaced by a →w a ∨ B; the quantified variants, such as ∀x.a →w (∀x.a) ∨ a →≡ ∀x.(a ∨ a) →ac ∀x.a with x not free in a, are deleted in the same way.

• w/≡/m: for example, C ∧ A →w (C ∧ A) ∨ (B ∧ D) →≡ (A ∧ C) ∨ (B ∧ D) →m (A ∨ B) ∧ (C ∨ D) is replaced by C ∧ A →≡ A ∧ C →w (A ∨ B) ∧ C →w (A ∨ B) ∧ (C ∨ D); in the second of the displayed cases, where the weakening introduces the disjunct ∀x.(A ∧ C), x is free in B ∧ D.
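To fix notation, here is one of the permutation steps above (the w/m case from the first item) written out as a pair of deep inference derivations, premise on top and conclusion at the bottom; this merely re-typesets a case already listed, assuming the standard amsmath and amssymb packages for \dfrac and \rightsquigarrow:

\[
\mathsf{m}\;\dfrac{\mathsf{w}\;\dfrac{A\wedge C}{(A\wedge C)\vee(B\wedge D)}}{(A\vee B)\wedge(C\vee D)}
\quad\rightsquigarrow\quad
\mathsf{w}\;\dfrac{\mathsf{w}\;\dfrac{A\wedge C}{(A\vee B)\wedge C}}{(A\vee B)\wedge(C\vee D)}
\]

On the left, the weakening sits above the m instance; on the right, the m instance has disappeared and the weakened material is introduced directly by two deep instances of w.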

C. Proof of Lemma 45

Proof of Lemma 45. By [47, Proposition 7.5], there is a derivation Ψ from (Gρϕ)◦ to H◦ in {ac, m, ≡◦}. We plan to show that Ψ can be lifted to {ac, c∀, m, m∀, m∃, ≡}. However, observe that not every formula occurring in Ψ is a propositional encoding. There are two reasons for this: (i) we might have P ≡◦ Q where P is a propositional encoding but Q is not, and (ii) the rule ac can duplicate an atom x ∈ VAR. Let us write acx for such instances. The problem with (i) is that we could have the following situation

S{(x ∧ (E ∧ C)) ∨ (x ∧ (F ∧ D))} →≡◦ S{((x ∧ E) ∧ C) ∨ ((x ∧ F) ∧ D)} →m S{((x ∧ E) ∨ (x ∧ F)) ∧ (C ∨ D)}    (7)

where x occurs in C ∨ D. Then premise and conclusion are both propositional encodings, but the whole derivation cannot be lifted. However, since we demand that the mapping is a fibration (and therefore a homomorphism) on the binding graphs, there must be another instance of m further below:

S′{(x ∧ E) ∨ (x ∧ F)} →m S′{(x ∨ x) ∧ (E ∨ F)}    (8)

We can permute both instances via the following more general scheme (see [23], [48] for a general discussion on permutations of the m-rule):

S{(G ∧ E ∧ C) ∨ (H ∧ F ∧ D)} →m S{((G ∧ E) ∨ (H ∧ F)) ∧ (C ∨ D)} →m S{(G ∨ H) ∧ (E ∨ F) ∧ (C ∨ D)}
  ↔
S{(G ∧ E ∧ C) ∨ (H ∧ F ∧ D)} →m S{(G ∨ H) ∧ ((E ∧ C) ∨ (F ∧ D))} →m S{(G ∨ H) ∧ (E ∨ F) ∧ (C ∨ D)}    (9)

We omitted some instances of ≡◦ and some parentheses. We now call instances of m as in (7) illegal, and we can transform Ψ through m-permutations (9) into a derivation that does not contain any illegal m-instances. To address (ii), we also apply a permutation argument, permuting all instances of acx up until they either reach the top of the derivation or an instance of m which separates the two atoms in the premise. More precisely, we consider the following inference rule

S0{S1{x} ∨ S2{x}} →ac≡x S{x}    (10)

where S1{·} ≡ {·} ∨ E and S2{·} ≡ {·} ∨ F and S{·} ≡ S0{{·} ∨ E ∨ F} for some formulas E and F, where E or F or both might be empty. The rule ac≡x permutes over ≡, ac, and other instances of ac≡x, and over instances of m if they occur inside S0 or S1 or S2. The only situation in which ac≡x cannot be permuted up is the following:

S{(R1{x} ∧ C) ∨ (R2{x} ∧ D)} →m S{(R1{x} ∨ R2{x}) ∧ (C ∨ D)} →ac≡x S{R{x} ∧ (C ∨ D)}    (11)

We can therefore assume that all instances of acx that contract an atom x ∈ VAR are either at the top of Ψ or below an m-instance as in (11). We now lift Ψ to {ac, c∀, m, m∀, m∃, ≡}. We proceed by induction on the height of Ψ, beginning at the top, making a case analysis on the topmost rule instance that is not an ≡.

• acx: We know that the premise of (10) is a propositional encoding. Hence, S1{·} = {·} ∨ E◦ and S2{·} = {·} ∨ F◦, both occurrences of x are universals, and E◦ ∨ F◦ contains all occurrences of x bound by that universal, where S◦{·}, E◦, F◦ are the propositional encodings of S{·}, E, F, respectively. We have the following subcases:
  – E and F are both non-empty: we have S◦{(x ∨ E◦) ∨ (x ∨ F◦)} →ac≡x S◦{x ∨ (E◦ ∨ F◦)}, which can be lifted to S{(∀x.E) ∨ (∀x.F)} →m∀ S{∀x.(E ∨ F)}.
  – E◦ is empty and F◦ is non-empty: we have S◦{x ∨ (x ∨ F◦)} →ac≡x S◦{x ∨ F◦}, which can be lifted to the derivation S{∀x.∀x.F} →c∀ S{∀x.F}.
  – E◦ is non-empty and F◦ is empty: this is similar to the previous case.
  – E◦ and F◦ are both empty: this is impossible, as the premise would not be a propositional encoding.

• ac (contracting an ordinary atom): this can trivially be lifted.

• m: there are several cases to consider.
  – If none of the four principal formulas in the premise is x or x ∨ F for some formula F and x ∈ VAR, then this instance of m can trivially be lifted, and we can proceed by the induction hypothesis.
  – If exactly one of the four principal formulas in the premise is x for some x ∈ VAR, then this x is the encoding of an existential in the premise and of a universal in the conclusion. This is impossible, as ϕ has to preserve existentials.
  – If two of the four principal formulas in the premise are x for some x ∈ VAR, then we are in the following special case of (11): S{(x ∧ C) ∨ (x ∧ D)} →m S{(x ∨ x) ∧ (C ∨ D)} →ac≡x S{x ∧ (C ∨ D)}, which can be lifted immediately to S{(∃x.C) ∨ (∃x.D)} →m∃ S{∃x.(C ∨ D)}.
  – We have a situation (11) where R1{x} ≡ x ∨ E for some E and R2{x} ≡ x ∨ F for some F, with R{x} ≡ x ∨ E ∨ F. (Otherwise, the application of ac≡x would not be correct.) That means we have S{((x ∨ E) ∧ C) ∨ ((x ∨ F) ∧ D)} →m S{((x ∨ E) ∨ (x ∨ F)) ∧ (C ∨ D)} →ac≡x S{(x ∨ E ∨ F) ∧ (C ∨ D)}, which can be lifted to S{((∀x.E) ∧ C) ∨ ((∀x.F) ∧ D)} →m S{((∀x.E) ∨ (∀x.F)) ∧ (C ∨ D)} →m∀ S{(∀x.(E ∨ F)) ∧ (C ∨ D)}.
  – In all other cases (e.g., exactly one of the principal formulas is of shape x ∨ F and none is x), we can trivially lift the m-instance, as the quantifier structure is not affected.

Thus Ψ can be lifted to a derivation Φ from Gρϕ to H in {ac, c∀, m, m∀, m∃, ≡}.
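For quick reference, the two quantifier-related liftings used in this proof can be summarised as follows; this restates displays from the proof above in LaTeX (assuming the amsmath package) and introduces no new cases:

\[
\dfrac{S^{\circ}\{x\vee(x\vee F^{\circ})\}}{S^{\circ}\{x\vee F^{\circ}\}}\;\mathsf{ac}^{\equiv}_{x}
\;\;\longmapsto\;\;
\dfrac{S\{\forall x.\forall x.F\}}{S\{\forall x.F\}}\;\mathsf{c}_{\forall}
\qquad
\dfrac{S\{(x\wedge C)\vee(x\wedge D)\}}{S\{x\wedge(C\vee D)\}}\;\mathsf{m};\mathsf{ac}^{\equiv}_{x}
\;\;\longmapsto\;\;
\dfrac{S\{(\exists x.C)\vee(\exists x.D)\}}{S\{\exists x.(C\vee D)\}}\;\mathsf{m}_{\exists}
\]

The remaining non-trivial case, namely (11) with R1{x} ≡ x ∨ E and R2{x} ≡ x ∨ F, is lifted analogously via m followed by m∀.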
