Interpreting Arguments


Informal Logic IX.1, Winter 1987

JONATHAN BERG
University of Haifa

We often speak of "the argument in" a particular piece of discourse. Although such talk may not be essential to philosophy in the philosophical sense, it is surely a practical requirement for most work in philosophy, whether in the interpretation of classical texts or in the discussion of contemporary work, and it arises as well in everyday contexts wherever we care about reasons given in support of a claim. (I assume this is why developing the ability to identify arguments in discourse is a major goal in many courses in informal logic or critical thinking-the major goal when I teach the subject.) Thus arises the question, just how do we get from text to argument?

I shall propose a set of principles for the extrication of arguments from argumentative prose. In so doing, I aim to provide both a (meta)philosophical theory about a central part of the philosophical routine and also some practical guidelines suitable for students of argument. First I shall present the theoretical framework underlying my remarks, and then, after an exposition of the principles, I shall show how they apply to some questions about implicit premises and about ignoratio elenchi.

1. Theoretical Preliminaries

I take an argument or inference to be a collection of claims, one of which, the conclusion, is put forth as following from the others, the premises. While not distinguishing strictly between arguments and inferences, I think of inferences as atomic arguments, in which none of the premises double as conclusions inferred from any of the other premises.

Two notions which would merit extensive analysis in a broader context will only get a bit of clarification here. By 'claims' I mean what are variously called 'propositions', 'statements', or 'contents'. I remain neutral with regard to their ontological status and shall not pretend to know exactly how to individuate them, especially in relation to sentences. I shall assume that for every claim there is at least one literal, declarative sentence typically used for making that claim, and often I shall not distinguish between the claim, proper, and such a sentence.

What I mean when I say that one claim follows from other claims is much harder to specify than what I do not mean. I certainly do not have in mind any mere psychological or causal connection. Nor do I wish to restrict myself to deductive validity. I shall rely for now on a very rough normative formulation: one claim follows from another (or others) if one should not accept the latter without accepting the former. And an argument is valid (as opposed to deductively valid, in the usual sense) if its conclusion does, indeed, follow from its premises.1

2. Arguments and Intentions

A more noteworthy feature of arguments as conceived here is their intentionality. An argument is not merely a collection of claims, nor even a collection of claims bearing a certain logical relation to each other, but rather, a collection of claims intended, by an arguer, to bear a certain logical relation to each other.2 This is not to say that we can speak of an argument only relative to some actual arguer, for we can always imagine, for the sake of discussion, a hypothetical arguer with the requisite intentions (much as we can discuss a speech act, such as the assertion that p, without regard to any actual speaker). But whether the arguer be actual or hypothetical, the structure of an argument (as well as the content) is largely determined by the arguer's intentions. Consequently, extracting arguments from their textual surroundings is a matter of discerning intentions. And how those intentions are to be discerned depends, of course, on our intentions as interpreters. In philosophical discussion the point of recounting an argument is to enable its evaluation, with regard to both its validity and also the acceptability of its premises-or at least I shall assume this as a working hypothesis. That is, I take it that I am not unusual among philosophers in intending when elucidating an argument to facilitate judgment of its soundness.3 Of course, I may have the ulterior motive of wanting to impart a particular attitude toward the argument, but I must nevertheless at least make out to be presenting the argument for the sake of objective evaluation (as I would want the attitude I favor to be seen as the inevitable result of fair and fitting consideration). Thus construed, the interpretation of argumentative discourse is like the interpretation, for the sake of critical review, of any expository prose, in that in both cases we aim to determine on the basis of the text (and its context) what the author meant, so that a decision may then be reached about its correctness. It should come as no surprise, then, that the principles of interpretation I formulate apply not only to argumentative discourse, but to any discourse being interpreted in preparation for the assessment of its descriptive content.

The application of these general principles to argumentative discourse is especially valuable for several reasons. First of all, when the extrication of arguments is treated as a kind of interpretation, more attention is paid to the arguer's intentions, which otherwise tend to be neglected as attention centers on strict logical validity. As I shall try to show, giving the arguer's intentions their due can shed some light on certain issues in the theory of argument interpretation. Secondly, general principles of interpretation cannot be applied to argumentative discourse without special attention to validity, due to its centrality in the enterprise of argument. Above all, the principles I present here constitute a reasoned, unified basis for argument interpretation, applying systematically to all aspects of the interpretive process-as opposed to, for instance, ad hoc rules for determining implicit premises.

3. Principles of Interpretation

The following principles are put forth as guidelines for presenting the argument contained in a given text, where the argument is to be presented by giving sentences corresponding to its claims and by indicating, via these sentences, which claims allegedly follow from which. Such a presentation often takes the form of a numbered list of sentences along with a tree diagram of the sentence numbers which indicates the lines of inference, but any method of presenting a set of sentences under a partial order will do.

I view these principles as mere guidelines, because they are not hard and fast rules; indeed, they are largely overdeterminate. Hence, all except the last should be understood as implicitly qualified by a ceteris paribus operator. Furthermore, they are by no means meant to provide a complete method for extricating arguments. Rather, they are intended to help in the application of more basic strategies for identifying the elements of an argument's structure, by contributing to the resolution of questions raised by these more fundamental strategies about how a particular claim should be formulated and how it functions in the argument.4

(a) The Principle of Loyalty: Be loyal to the text (written or spoken)-in formulating claims prefer the wording of the text, and in general prefer interpretations supported by the strongest, most direct textual evidence (including what is known of the circumstances under which the text was produced).

The text, after all, is the entire manifestation of the argument; it is, in a sense, all the interpreter has to go on. Since the text completely exhausts the arguer's explicit commitments, deviation from it increases the interpreter's vulnerability to charges of misinterpretation. And the interpretation is fatally flawed to the extent that the arguer can say it is not what he meant.5 This applies not only to the formulation of the argument's claims, but also to the determination of the lines of inference. The word 'therefore' is strong, direct textual evidence in support of taking the claim immediately after it as allegedly following from the claim(s) before it; mere juxtaposition may also suggest such a link (though with no hint of the direction of the inference), but it would be much weaker, less direct evidence.

(b) The Principle of Clarity: Present the argument as clearly as possible-be literal, precise, and terse.

[...]

the defenceless fetus must bear full responsibility for his wicked crime.

with

An individual who performs an abortion must bear full responsibility for his action.

(d) The Principle of Charity: Assume the arguer is not grossly deficient in either general knowledge or logical competence, i.e., that he would not make wildly absurd claims or inferences.

This follows more or less from the assumption that the argument in question is worth considering, which should be assumed at least for the sake of discussion.7 This principle is much more modest than many of its popular namesakes, for it is aimed at pulling out of the text the arguer's argument, which is not always the best argument that can be read into the text.8

(e) The Principle of Principled Preference: The preferability of the favored interpretation over rejected competing interpretations must be justifiable in terms of the principles above; hence, alternative interpretations that cannot be dismissed so must not be neglected.

This principle deals with the overdeterminacy of the others. As noted [...]
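Berg describes presenting an argument as a numbered list of sentences together with a tree diagram of the sentence numbers indicating the alleged lines of inference, or any other way of presenting a set of sentences under a partial order. The sketch below is not part of Berg's article; it is one minimal way such a presentation could be represented and printed. The major premise is the reformulated abortion sentence quoted above; the minor premise, the conclusion, and all names are hypothetical illustrations.

```python
# A minimal sketch (not from Berg's article) of the presentation format described in
# section 3: a numbered list of sentences plus a partial order over the numbers,
# indicating which claims allegedly follow from which.

claims = {
    1: "An individual who performs an abortion must bear full responsibility "
       "for his action.",
    2: "Dr. N performed an abortion.",  # hypothetical minor premise
    3: "Dr. N must bear full responsibility for his action.",  # hypothetical conclusion
}

# The "tree diagram": each conclusion number mapped to the numbers of the claims
# it is alleged to follow from. Any representation of a partial order would do.
inferences = {3: [1, 2]}

def present(claims, inferences):
    """Print the numbered list of sentences and the alleged lines of inference."""
    for n in sorted(claims):
        print(f"({n}) {claims[n]}")
    for conclusion, premises in inferences.items():
        print(" + ".join(f"({p})" for p in premises) + f"  =>  ({conclusion})")

present(claims, inferences)
```

Because no premise here doubles as a conclusion, the structure corresponds to what Berg calls an atomic argument (an inference); adding further entries to the inferences mapping, with some premises themselves appearing as conclusions, would represent a compound argument.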