On Computability.pdf

Total Pages: 16

File Type: pdf, Size: 1020 KB

ON COMPUTABILITY

Wilfried Sieg

1 INTRODUCTION

Computability is perhaps the most significant and distinctive notion modern logic has introduced; in the guise of decidability and effective calculability it has a venerable history within philosophy and mathematics. Now it is also the basic theoretical concept for computer science, artificial intelligence and cognitive science. This essay discusses, at its heart, methodological issues that are central to any mathematical theory that is to reflect parts of our physical or intellectual experience. The discussion is grounded in historical developments that are deeply intertwined with meta-mathematical work in the foundations of mathematics. How is that possible, the reader might ask, when the essay is concerned solely with computability? This introduction begins to give an answer by first describing the context of foundational investigations in logic and mathematics and then sketching the main lines of the systematic presentation.

1.1 Foundational contexts

In the second half of the 19th century the issues of decidability and effective calculability rose to the fore in discussions concerning the nature of mathematics. The divisive character of these discussions is reflected in the tensions between Dedekind and Kronecker, each holding broad methodological views that affected deeply their scientific practice. Dedekind contributed perhaps most to the radical transformation that led to modern mathematics: he introduced abstract axiomatizations in parts of the subject (e.g., algebraic number theory) and in the foundations for arithmetic and analysis. Kronecker is well known for opposing that high level of structuralist abstraction and insisting, instead, on the decidability of notions and the effective construction of mathematical objects from the natural numbers. Kronecker's concerns were of a traditional sort and were recognized as perfectly legitimate by Hilbert and others, as long as they were positively directed towards the effective solution of mathematical problems and not negatively used to restrict the free creations of the mathematical mind.

At the turn of the 20th century, these structuralist tendencies found an important expression in Hilbert's book Grundlagen der Geometrie and in his essay Über den Zahlbegriff. Hilbert was concerned, as Dedekind had been, with the consistency of the abstract notions and tried to address the issue also within a broad set theoretic/logicist framework. The framework could have already been sharpened at that point by adopting the contemporaneous development of Frege's Begriffsschrift, but that was not done until the late 1910s, when Russell and Whitehead's work had been absorbed in the Hilbert School. This rather circuitous development is apparent from Hilbert and Bernays' lectures [1917/18] and the many foundational lectures Hilbert gave between 1900 and the summer semester of 1917. Apart from using a version of Principia Mathematica as the frame for formalizing mathematics in a direct way, Hilbert and Bernays pursued a dramatically different approach with a sharp focus on meta-mathematical questions like the semantic completeness of logical calculi and the syntactic consistency of mathematical theories.
In his Habilitationsschrift of 1918, Bernays established the semantic completeness for the sentential logic of Principia Mathematica and presented a system of provably independent axioms. The completeness result turned the truth-table test for validity (or logical truth) into an effective criterion for provability in the logical calculus. This latter problem has a long and distinguished history in philosophy and logic, and its pre-history reaches back at least to Leibniz. I am alluding of course to the decision problem ("Entscheidungsproblem"). Its classical formulation for first-order logic is found in Hilbert and Ackermann's book Grundzüge der theoretischen Logik. This problem was viewed as the main problem of mathematical logic and begged for a rigorous definition of mechanical procedure or finite decision procedure.

How intricately the "Entscheidungsproblem" is connected with broad perspectives on the nature of mathematics is brought out by an amusingly illogical argument in von Neumann's essay Zur Hilbertschen Beweistheorie from 1927:

… it appears that there is no way of finding the general criterion for deciding whether or not a well-formed formula a is provable. (We cannot at the moment establish this. Indeed, we have no clue as to how such a proof of undecidability would go.) … the undecidability is even a conditio sine qua non for the contemporary practice of mathematics, using as it does heuristic methods, to make any sense. The very day on which the undecidability does not obtain any more, mathematics as we now understand it would cease to exist; it would be replaced by an absolutely mechanical prescription (eine absolut mechanische Vorschrift) by means of which anyone could decide the provability or unprovability of any given sentence. Thus we have to take the position: it is generally undecidable, whether a given well-formed formula is provable or not.

If the underlying conceptual problem had been attacked directly, then something like Post's unpublished investigations from the 1920s would have been carried out in Göttingen. A different and indirect approach evolved instead, whose origins can be traced back to the use of calculable number theoretic functions in finitist consistency proofs for parts of arithmetic. Here we find the most concrete beginning of the history of modern computability with close ties to earlier mathematical and later logical developments.

There is a second sense in which "foundational context" can be taken, not as referring to work in the foundations of mathematics, but directly in modern logic and cognitive science. Without a deeper understanding of the nature of calculation and underlying processes, neither the scope of undecidability and incompleteness results nor the significance of computational models in cognitive science can be explored in their proper generality. The claim for logic is almost trivial and implies the claim for cognitive science. After all, the relevant logical notions have been used when striving to create artificial intelligence or to model mental processes in humans. These foundational problems come strikingly to the fore in arguments for Church's or Turing's Thesis, asserting that an informal notion of effective calculability is captured fully by a particular precise mathematical concept.
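The effective criterion that Bernays' completeness result yields is simple enough to implement directly. Below is a minimal sketch in Python (the tuple encoding of formulas is an assumption made for the illustration, not anything from Sieg's text): it decides validity of a sentential formula by evaluating it under all truth assignments.

```python
from itertools import product

# Formulas are nested tuples: ("var", "p"), ("not", f), ("and", f, g),
# ("or", f, g), ("implies", f, g).  This encoding is chosen for the
# sketch; any other representation would do.

def atoms(f):
    """Collect the sentential variables occurring in a formula."""
    if f[0] == "var":
        return {f[1]}
    return set().union(*(atoms(g) for g in f[1:]))

def value(f, v):
    """Evaluate formula f under the truth assignment v (a dict)."""
    op = f[0]
    if op == "var":
        return v[f[1]]
    if op == "not":
        return not value(f[1], v)
    if op == "and":
        return value(f[1], v) and value(f[2], v)
    if op == "or":
        return value(f[1], v) or value(f[2], v)
    if op == "implies":
        return (not value(f[1], v)) or value(f[2], v)
    raise ValueError(f"unknown connective: {op}")

def valid(f):
    """Truth-table test: f is valid iff true under every assignment."""
    vs = sorted(atoms(f))
    return all(value(f, dict(zip(vs, row)))
               for row in product([False, True], repeat=len(vs)))

# Peirce's law ((p -> q) -> p) -> p is valid:
p, q = ("var", "p"), ("var", "q")
peirce = ("implies", ("implies", ("implies", p, q), p), p)
assert valid(peirce)
```

By the completeness theorem, validity so tested coincides with provability in the sentential calculus; the Entscheidungsproblem asked for an analogous test for first-order logic, where, as Church and Turing later showed, no such procedure exists.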
Church's Thesis, for example, claims in its original form that the effectively calculable number theoretic functions are exactly those functions whose values are computable in Gödel's equational calculus, i.e., the general recursive functions.

There is general agreement that Turing gave the most convincing analysis of effective calculability in his 1936 paper On computable numbers — with an application to the Entscheidungsproblem. It is Turing's distinctive philosophical contribution that he brought the computing agent into the center of the analysis, and that was for Turing a human being, proceeding mechanically.¹ Turing's student Gandy followed in his [1980] the outline of Turing's work in his analysis of machine computability. Their work is not only closely examined in this essay, but also thoroughly recast. In the end, the detailed conceptual analysis presented below yields rigorous characterizations that dispense with theses, reveal human and machine computability as axiomatically given mathematical concepts and allow their systematic reduction to Turing computability.

1.2 Overview

The core of section 2 is devoted to decidability and calculability. Dedekind introduced in his essay Was sind und was sollen die Zahlen? the general concept of a "(primitive) recursive" function and proved that these functions can be made explicit in his logicist framework. Beginning in 1921, these obviously calculable functions were used prominently in Hilbert's work on the foundations of mathematics, i.e., in the particular way he conceived of finitist mathematics and its role in consistency proofs. Hilbert's student Ackermann discovered already before 1925 a non-primitive recursive function that was nevertheless calculable. In 1931, Herbrand, working on Hilbert's consistency problem, gave a very general and open-ended characterization of "finitistically calculable number-theoretic functions" that included also the Ackermann function. This section emphasizes the broader intellectual context and points to the rather informal and epistemologically motivated demand that, in the development of logic and mathematics, certain notions (for example, proof) should be decidable by humans and others should not (for example, theorem). The crucial point is that the core concepts were deeply intertwined with mathematical practice and logical tradition before they came together in Hilbert's consistency program or, more generally, in meta-mathematics. In section 3, entitled Recursiveness and Church's Thesis, we …

¹ The Shorter Oxford English Dictionary makes perfectly clear that mechanical, when applied to a person or action, means "performing or performed without thought; lacking spontaneity or originality; machine-like; automatic, routine."
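The Ackermann function singled out in the overview is easy to state and to compute, which is exactly what made it a pointed example: evidently calculable, yet growing too fast to be primitive recursive. A minimal sketch, assuming Python and using the common two-argument Péter variant rather than Ackermann's original three-argument function:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def ackermann(m, n):
    """Peter's two-argument variant of Ackermann's function.

    Calculable by straightforward recursion, but it eventually
    dominates every primitive recursive function, so it is not
    primitive recursive itself.
    """
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

# Small values are tame; growth in m is catastrophic:
print([ackermann(m, 2) for m in range(4)])  # [3, 4, 7, 29]
```

Its totality is provable, but by recursion on the lexicographic order of the pair (m, n) rather than by any single primitive recursion, which is where the Hilbert school's catalogue of obviously calculable functions fell short.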
Recommended publications
  • Computability of Fraïssé Limits
COMPUTABILITY OF FRAÏSSÉ LIMITS. BARBARA F. CSIMA, VALENTINA S. HARIZANOV, RUSSELL MILLER, AND ANTONIO MONTALBÁN. Abstract. Fraïssé studied countable structures S through analysis of the age of S, i.e., the set of all finitely generated substructures of S. We investigate the effectiveness of his analysis, considering effectively presented lists of finitely generated structures and asking when such a list is the age of a computable structure. We focus particularly on the Fraïssé limit. We also show that degree spectra of relations on a sufficiently nice Fraïssé limit are always upward closed unless the relation is definable by a quantifier-free formula. We give some sufficient or necessary conditions for a Fraïssé limit to be spectrally universal. As an application, we prove that the computable atomless Boolean algebra is spectrally universal.

Contents: 1. Introduction (1.1 Classical results about Fraïssé limits and background definitions); 2. Computable Ages; 3. Computable Fraïssé limits (3.1 Computable properties of Fraïssé limits; 3.2 Existence of computable Fraïssé limits); 4. Examples; 5. Upward closure of degree spectra of relations; 6. Necessary conditions for spectral universality (6.1 Local finiteness; 6.2 Finite realizability); 7. A sufficient condition for spectral universality (7.1 The countable atomless Boolean algebra); References.

1. Introduction. Computable model theory studies the algorithmic complexity of countable structures, of their isomorphisms, and of relations on such structures. Since algorithmic properties often depend on data presentation, in computable model theory classically isomorphic structures can have different computability-theoretic properties.
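Fraïssé's central notion is concrete enough to compute for finite structures. The sketch below is a brute-force illustration in Python (graphs are my choice of structure class for the example; the paper works with arbitrary finitely generated structures): it lists a finite graph's age, its induced substructures up to isomorphism.

```python
from itertools import combinations, permutations

def induced(edges, verts):
    """Edge set of the substructure induced on the vertex set."""
    return {e for e in edges if e[0] in verts and e[1] in verts}

def isomorphic(v1, e1, v2, e2):
    """Brute-force graph isomorphism; fine only for tiny graphs."""
    v1, v2 = sorted(v1), sorted(v2)
    if len(v1) != len(v2) or len(e1) != len(e2):
        return False
    for perm in permutations(v2):
        f = dict(zip(v1, perm))
        if {frozenset((f[a], f[b])) for a, b in e1} == \
           {frozenset((a, b)) for a, b in e2}:
            return True
    return False

def age(verts, edges):
    """All induced substructures of a finite graph, up to isomorphism."""
    reps = []
    for k in range(1, len(verts) + 1):
        for vs in combinations(verts, k):
            es = induced(edges, vs)
            if not any(isomorphic(vs, es, w, f) for w, f in reps):
                reps.append((vs, es))
    return reps

# A path on 3 vertices has 4 substructure types: a point, an edge,
# two isolated points, and the path itself.
print(len(age([0, 1, 2], {(0, 1), (1, 2)})))  # 4
```

The paper's effectiveness questions begin where this brute force ends: when is an effectively presented infinite list of such finite structures the age of a computable structure?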
  • Artificial Consciousness and the Consciousness-Attention Dissociation
Consciousness and Cognition 45 (2016) 210–225. Review article. Artificial consciousness and the consciousness-attention dissociation. Harry Haroutioun Haladjian (Laboratoire Psychologie de la Perception, CNRS (UMR 8242), Université Paris Descartes, Centre Biomédical des Saints-Pères, 45 rue des Saints-Pères, 75006 Paris, France) and Carlos Montemayor (San Francisco State University, Philosophy Department, 1600 Holloway Avenue, San Francisco, CA 94132, USA). Received 6 July 2016; accepted 12 August 2016. Keywords: artificial intelligence; artificial consciousness; consciousness; visual attention; phenomenology; emotions; empathy.

Abstract. Artificial Intelligence is at a turning point, with a substantial increase in projects aiming to implement sophisticated forms of human intelligence in machines. This research attempts to model specific forms of intelligence through brute-force search heuristics and also reproduce features of human perception and cognition, including emotions. Such goals have implications for artificial consciousness, with some arguing that it will be achievable once we overcome short-term engineering challenges. We believe, however, that phenomenal consciousness cannot be implemented in machines. This becomes clear when considering emotions and examining the dissociation between consciousness and attention in humans. While we may be able to program ethical behavior based on rules and machine learning, we will never be able to reproduce emotions or empathy by programming such control systems—these will be merely simulations. Arguments in favor of this claim include considerations about evolution, the neuropsychological aspects of emotions, and the dissociation between attention and consciousness found in humans.
  • Summary of Computability Theory
CS:4330 Theory of Computation, Spring 2018. Computability Theory Summary. Haniel Barbosa. Readings for this lecture: Chapters 3–5 and Section 6.2 of [Sipser 1996], 3rd edition.

A hierarchy of languages: regular: a^n b^m; deterministic context-free: a^n b^n; context-free: a^n b^n ∪ a^n b^{2n}; Turing-decidable: a^n b^n c^n; Turing-recognizable: A_TM.

Why TMs? In 1900 Hilbert posed 23 "challenge problems" in mathematics. The 10th problem: devise a process according to which it can be decided by a finite number of operations if a given polynomial has an integral root. It became necessary to have a formal definition of "algorithm" to define their expressivity.

Church–Turing Thesis. In 1936 Church and Turing independently defined "algorithm": the λ-calculus and Turing machines. Intuitive notion of algorithms = Turing machine algorithms: "Any process which could be naturally called an effective procedure can be realized by a Turing machine." We now know: Hilbert's 10th problem is undecidable!

Algorithm as Turing machine. Definition (Algorithm): an algorithm is a decider TM in the standard representation. The input to a TM is always a string. If we want an object other than a string as input, we must first represent that object as a string. Strings can easily represent polynomials, graphs, grammars, automata, and any combination of these objects.

How to determine decidability / Turing-recognizability? Decidable / Turing-recognizable: present a TM that decides (recognizes) the language. If A is mapping reducible to
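The separating examples in this hierarchy are concrete enough to test. The sketch below is a decider, in Sipser's sense, for the non-context-free language a^n b^n c^n, written as an ordinary Python program rather than a TM (a simulation shortcut of this illustration, not something from the course notes):

```python
import re

def decides_anbncn(s: str) -> bool:
    """Decider for { a^n b^n c^n : n >= 0 }.

    It halts on every input with accept/reject -- the defining
    property of a decider, as opposed to a mere recognizer.
    """
    m = re.fullmatch(r"(a*)(b*)(c*)", s)
    return m is not None and \
        len(m.group(1)) == len(m.group(2)) == len(m.group(3))

assert decides_anbncn("aabbcc")
assert not decides_anbncn("aabbc")
assert decides_anbncn("")          # n = 0
assert not decides_anbncn("abab")  # wrong shape
```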
  • Computability and Complexity
Computability and Complexity. Lecture Notes. Herbert Jaeger, Jacobs University Bremen. Version history: Jan 30, 2018: created as copy of CC lecture notes of Spring 2017; Feb 16, 2018: cleaned up font conversion hiccups up to Section 5; Feb 23, 2018: cleaned up font conversion hiccups up to Section 6; Mar 15, 2018: cleaned up font conversion hiccups up to Section 7; Apr 5, 2018: cleaned up font conversion hiccups up to the end of these lecture notes.

1 Introduction. 1.1 Motivation. This lecture will introduce you to the theory of computation and the theory of computational complexity. The theory of computation offers full answers to the questions:
  • what problems can in principle be solved by computer programs?
  • what functions can in principle be computed by computer programs?
  • what formal languages can in principle be decided by computer programs?
Full answers to these questions have been found in the last 70 years or so, and we will learn about them. (And it turns out that these questions are all the same question.) The theory of computation is well-established, transparent, and basically simple (you might disagree at first). The theory of complexity offers many insights into questions like:
  • for a given problem / function / language that has to be solved / computed / decided by a computer program, how long does the fastest program actually run?
  • how much memory space has to be used at least?
  • can you speed up computations by using different computer architectures or different programming approaches?
The theory of complexity is historically younger than the theory of computation – the first surge of results came in the 1960s.
  • Chapter 1 Logic and Set Theory
Chapter 1. Logic and Set Theory.

"To criticize mathematics for its abstraction is to miss the point entirely. Abstraction is what makes mathematics work. If you concentrate too closely on too limited an application of a mathematical idea, you rob the mathematician of his most important tools: analogy, generality, and simplicity." – Ian Stewart, Does God Play Dice? The Mathematics of Chaos

In mathematics, a proof is a demonstration that, assuming certain axioms, some statement is necessarily true. That is, a proof is a logical argument, not an empirical one. One must demonstrate that a proposition is true in all cases before it is considered a theorem of mathematics. An unproven proposition for which there is some sort of empirical evidence is known as a conjecture. Mathematical logic is the framework upon which rigorous proofs are built. It is the study of the principles and criteria of valid inference and demonstrations. Logicians have analyzed set theory in great detail, formulating a collection of axioms that affords a broad enough and strong enough foundation for mathematical reasoning. The standard form of axiomatic set theory is denoted ZFC and it consists of the Zermelo–Fraenkel (ZF) axioms combined with the axiom of choice (C). Each of the axioms included in this theory expresses a property of sets that is widely accepted by mathematicians. It is unfortunately true that careless use of set theory can lead to contradictions. Avoiding such contradictions was one of the original motivations for the axiomatization of set theory. A rigorous analysis of set theory belongs to the foundations of mathematics and mathematical logic.
  • Inventing Computational Rhetoric
INVENTING COMPUTATIONAL RHETORIC. By Michael W. Wojcik. A thesis submitted to Michigan State University in partial fulfillment of the requirements for the degree of Digital Rhetoric and Professional Writing — Master of Arts, 2013. Copyright by Michael W. Wojcik, 2013. For Malea.

ABSTRACT. Many disciplines in the humanities are developing "computational" branches which make use of information technology to process large amounts of data algorithmically. The field of computational rhetoric is in its infancy, but we are already seeing interesting results from applying the ideas and goals of rhetoric to text processing and related areas. After considering what computational rhetoric might be, three approaches to inventing computational rhetorics are presented: a structural schema, a review of extant work, and a theoretical exploration.

ACKNOWLEDGEMENTS. Above all else I must thank my beloved wife, Malea Powell, without whose prompting this thesis would have remained forever incomplete. I am also grateful for the presence in my life of my terrific stepdaughter, Audrey Swartz, and wonderful granddaughter Lucille. My thesis committee, Dean Rehberger, Bill Hart-Davidson, and John Monberg, provided me with generous guidance and inspiration. Other faculty members at Michigan State who helped me explore relevant ideas include Rochelle Harris, Mike McLeod, Joyce Chai, Danielle Devoss, and Bump Halbritter. My previous academic program at Miami University did not result in a degree, but faculty there also contributed greatly to my theoretical understanding, particularly Susan Morgan, Mary-Jean Corbett, Brit Harwood, J. Edgar Tidwell, Lori Merish, Vicki Smith, Alice Adams, Fran Dolan, and Keith Tuma.
  • John P. Burgess, Department of Philosophy, Princeton University, Princeton, NJ 08544-1006, USA, [email protected]
LOGIC & PHILOSOPHICAL METHODOLOGY. Introduction. For present purposes "logic" will be understood to mean the subject whose development is described in Kneale & Kneale [1961] and of which a concise history is given in Scholz [1961]. As the terminological discussion at the beginning of the latter reference makes clear, this subject has at different times been known by different names, "analytics" and "organon" and "dialectic", while inversely the name "logic" has at different times been applied much more broadly and loosely than it will be here. At certain times and in certain places — perhaps especially in Germany from the days of Kant through the days of Hegel — the label has come to be used so very broadly and loosely as to threaten to take in nearly the whole of metaphysics and epistemology. Logic in our sense has often been distinguished from "logic" in other, sometimes unmanageably broad and loose, senses by adding the adjectives "formal" or "deductive". The scope of the art and science of logic, once one gets beyond elementary logic of the kind covered in introductory textbooks, is indicated by two other standard references, the Handbooks of mathematical and philosophical logic, Barwise [1977] and Gabbay & Guenthner [1983–89], though the latter includes also parts that are identified as applications of logic rather than logic proper. The term "philosophical logic" as currently used, for instance, in the Journal of Philosophical Logic, is a near-synonym for "nonclassical logic". There is an older use of the term as a near-synonym for "philosophy of language".
  • Basic Concepts of Set Theory, Functions and Relations
Ling 310 (adapted from UMass Ling 409), Partee lecture notes, March 1, 2006. Basic Concepts of Set Theory, Functions and Relations. Contents: 1. Basic Concepts of Set Theory (1.1 Sets and elements; 1.2 Specification of sets; 1.3 Identity and cardinality; 1.4 Subsets; 1.5 Power sets; 1.6 Operations on sets: union, intersection; 1.7 More operations on sets: difference, complement; 1.8 Set-theoretic equalities); Chapter 2. Relations and Functions …
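The operations this outline lists map directly onto any language with a set type. A quick sketch in Python's built-in sets (the finite universe U for complements is an assumption the example must fix explicitly, since a complement is only defined relative to one):

```python
from itertools import chain, combinations

A, B = {1, 2, 3}, {3, 4}
U = {1, 2, 3, 4, 5}           # fixed finite "universe" for complements

print(A | B)                  # union: {1, 2, 3, 4}
print(A & B)                  # intersection: {3}
print(A - B)                  # difference: {1, 2}
print(U - A)                  # complement relative to U: {4, 5}
print(A <= U, {3} <= A)       # subset tests: True True
print(len(A), len(B))         # cardinality: 3 2

def power_set(s):
    """All subsets of s; there are 2**len(s) of them, hence the name."""
    items = list(s)
    return [set(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

print(len(power_set(A)))      # 8 == 2**3
```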
  • Are Large Cardinal Axioms Restrictive?
Are Large Cardinal Axioms Restrictive? Neil Barton (Fachbereich Philosophie, University of Konstanz; neil.barton@uni-konstanz.de). 24 June 2020.

Abstract. The independence phenomenon in set theory, while pervasive, can be partially addressed through the use of large cardinal axioms. A commonly assumed idea is that large cardinal axioms are species of maximality principles. In this paper, I argue that whether or not large cardinal axioms count as maximality principles depends on prior commitments concerning the richness of the subset forming operation. In particular I argue that there is a conception of maximality through absoluteness, on which large cardinal axioms are restrictive. I argue, however, that large cardinals are still important axioms of set theory and can play many of their usual foundational roles.

Introduction. Large cardinal axioms are widely viewed as some of the best candidates for new axioms of set theory. They are (apparently) linearly ordered by consistency strength, have substantial mathematical consequences for questions independent from ZFC (such as consistency statements and Projective Determinacy), and appear natural to the …

Acknowledgements. I would like to thank David Asperó, David Fernández-Bretón, Monroe Eskew, Sy-David Friedman, Victoria Gitman, Luca Incurvati, Michael Potter, Chris Scambler, Giorgio Venturi, Matteo Viale, Kameryn Williams and audiences in Cambridge, New York, Konstanz, and São Paulo for helpful discussion. Two anonymous referees also provided helpful comments, and I am grateful for their input. I am also very grateful for the generous support of the FWF (Austrian Science Fund) through Project P 28420 (The Hyperuniverse Programme) and the VolkswagenStiftung through the project Forcing: Conceptual Change in the Foundations of Mathematics.
  • The 1910 Principia's Theory of Functions and Classes and the Theory of Descriptions
EUJAP Vol. 3, No. 2, 2007. Original scientific paper. UDK: 165. THE 1910 PRINCIPIA'S THEORY OF FUNCTIONS AND CLASSES AND THE THEORY OF DESCRIPTIONS. William Demopoulos (The University of Western Ontario).

ABSTRACT. It is generally acknowledged that the 1910 Principia does not deny the existence of classes, but claims only that the theory it advances can be developed so that any apparent commitment to them is eliminable by the method of contextual analysis. The application of contextual analysis to ontological questions is widely viewed as the central philosophical innovation of Russell's theory of descriptions. Principia's "no-classes theory of classes" is a striking example of such an application. The present paper develops a reconstruction of Principia's theory of functions and classes that is based on Russell's epistemological applications of the method of contextual analysis. Such a reconstruction is not eliminativist—indeed, it explicitly assumes the existence of classes—and possesses certain advantages over the no-classes theory advocated by Whitehead and Russell.

1. Introduction. The 1910 Principia's theory of propositional functions and classes is officially a "no-classes theory of classes," a theory according to which classes are eliminable. But it is clear from Principia's solution to the class paradoxes that although the theory it advances holds that classes are eliminable, it does not deny their existence. Whitehead and Russell argue from the supposition that classes involve or presuppose propositional functions to the conclusion that the paradoxical classes are excluded by the nature of such functions. This supposition rests on the repre…
  • Computability Theory
CSC 438F/2404F Notes (S. Cook and T. Pitassi), Fall 2019. Computability Theory. This section is partly inspired by the material in "A Course in Mathematical Logic" by Bell and Machover, Chap. 6, sections 1–10. Other references: "Introduction to the Theory of Computation" by Michael Sipser, and "Computability, Complexity, and Languages" by M. Davis and E. Weyuker.

Our first goal is to give a formal definition for what it means for a function on N to be computable by an algorithm. Historically the first convincing such definition was given by Alan Turing in 1936, in his paper which introduced what we now call Turing machines. Slightly before Turing, Alonzo Church gave a definition based on his lambda calculus. About the same time Gödel, Herbrand, and Kleene developed definitions based on recursion schemes. Fortunately all of these definitions are equivalent, and each of many other definitions proposed later are also equivalent to Turing's definition. This has led to the general belief that these definitions have got it right, and this assertion is roughly what we now call "Church's Thesis".

A natural definition of computable function f on N allows for the possibility that f(x) may not be defined for all x ∈ N, because algorithms do not always halt. Thus we will use the symbol ∞ to mean "undefined".

Definition: A partial function is a function f : (N ∪ {∞})^n → N ∪ {∞}, n ≥ 0, such that f(c1, ..., cn) = ∞ if some ci = ∞. In the context of computability theory, whenever we refer to a function on N, we mean a partial function in the above sense.
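The convention that ∞ ("undefined") is absorbed by every partial function is easy to mirror in code. A minimal sketch, assuming Python and using None in place of ∞ (an encoding choice of this illustration, not the notes'):

```python
def strict(f):
    """Lift f to a partial function: any undefined (None) argument
    makes the result undefined, mirroring f(c1, ..., cn) = oo
    whenever some ci = oo in the notes."""
    def g(*args):
        if any(a is None for a in args):
            return None
        return f(*args)
    return g

def bounded_search(pred, bound):
    """A typical source of partial functions: unbounded search,
    here cut off at `bound` steps so the sketch always halts.
    Returns the least witness, or None ('undefined') if none is
    found within the budget."""
    for x in range(bound):
        if pred(x):
            return x
    return None

add = strict(lambda x, y: x + y)
print(add(2, 3))                                   # 5
print(add(2, None))                                # None: undefinedness propagates
print(bounded_search(lambda x: x * x > 50, 100))   # 8
print(bounded_search(lambda x: x < 0, 100))        # None
```

The step bound is of course a cheat: a genuine unbounded search is exactly where non-halting, and hence partiality, enters computability theory.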
  • A Short History of Computational Complexity
The Computational Complexity Column, by Lance Fortnow, NEC Laboratories America, 4 Independence Way, Princeton, NJ 08540, USA. [email protected] http://www.neci.nj.nec.com/homepages/fortnow/beatcs

Every third year the Conference on Computational Complexity is held in Europe and this summer the University of Aarhus (Denmark) will host the meeting July 7–10. More details at the conference web page http://www.computationalcomplexity.org. This month we present a historical view of computational complexity written by Steve Homer and myself. This is a preliminary version of a chapter to be included in an upcoming North-Holland Handbook of the History of Mathematical Logic edited by Dirk van Dalen, John Dawson and Aki Kanamori.

A Short History of Computational Complexity. Lance Fortnow (NEC Research Institute, 4 Independence Way, Princeton, NJ 08540) and Steve Homer (Computer Science Department, Boston University, 111 Cummington Street, Boston, MA 02215).

1 Introduction. It all started with a machine. In 1936, Turing developed his theoretical computational model. He based his model on how he perceived mathematicians think. As digital computers were developed in the '40s and '50s, the Turing machine proved itself as the right theoretical model for computation. Quickly though we discovered that the basic Turing machine model fails to account for the amount of time or memory needed by a computer, a critical issue today but even more so in those early days of computing. The key idea to measure time and space as a function of the length of the input came in the early 1960s by Hartmanis and Stearns.
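The idea credited here to Hartmanis and Stearns, measuring cost as a function of the length of the input, can be seen in miniature by instrumenting any program. A toy sketch in Python (the step counter stands in for counting Turing machine moves, a simplification of this illustration):

```python
def count_steps(s: str) -> tuple[bool, int]:
    """Decide whether s is a palindrome, counting comparisons.

    The point is not the algorithm but the bookkeeping: complexity
    theory measures cost as a function of the input length n.
    """
    steps = 0
    i, j = 0, len(s) - 1
    while i < j:
        steps += 1
        if s[i] != s[j]:
            return False, steps
        i, j = i + 1, j - 1
    return True, steps

for n in (4, 8, 16, 32):
    _, steps = count_steps("a" * n)
    print(n, steps)        # steps grow as n/2: linear time in n
```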