A Non-Deterministic Call-by-Need Lambda Calculus: Proving Similarity a Precongruence by an Extension of Howe's Method to Sharing


Dissertation submitted for the degree of Doktor der Naturwissenschaften to Fachbereich 15 of the Johann Wolfgang Goethe-Universität in Frankfurt am Main, by Matthias Mann of Frankfurt. Frankfurt 2005 (D F 1). Accepted as a dissertation by the Fachbereich Informatik und Mathematik of the Johann Wolfgang Goethe-Universität.

Dean: Prof. Dr.-Ing. Detlef Krömker

Reviewers: Prof. Dr. Manfred Schmidt-Schauß, Johann Wolfgang Goethe-Universität, Fachbereich Informatik und Mathematik, Robert-Mayer-Str. 11-15, 60325 Frankfurt; and Prof. David Sands, Ph.D., Chalmers University of Technology and University of Göteborg, Department of Computing Science, S-412 96 Göteborg, Sweden.

Date of defence: 17 August 2005

Summary (Zusammenfassung)

The subject of this thesis is a non-deterministic lambda calculus with call-by-need evaluation. Lambda calculi in general play a fundamental role in theoretical computer science, be it as the basis of theorem provers in the field of verification or in the design and semantic analysis of programming languages, functional ones in particular.

As the complexity of hardware and software systems keeps growing, functional programming languages will gain ever greater importance. The reason is that software development always involves a certain gap between the concept a programmer devises and the way it has to be expressed in a given programming language. To guarantee robust and reliable software, keeping this gap as small as possible should be one of the principal concerns of computer science.
Functional programming languages achieve this better than imperative or object-oriented ones because, owing to their origins in semantics, they are oriented more strongly towards the result of a computation. Imperative and object-oriented languages, by contrast, often overemphasise which steps are necessary to obtain the desired result.

Lambda calculus and functional programming

Although the λ-calculus forms the conceptual foundation of the family of functional languages, it does not adequately reflect the implementation of modern functional programming languages. This holds especially for the so-called lazy functional languages, i.e. those in which arguments are evaluated only if they contribute to determining the result. To avoid repeated evaluation when a function is applied to arguments that are not yet fully evaluated, all implementations of practical relevance today use graph reduction, i.e. they operate not on a term tree but on a graph.

An example may illustrate this. The expressions λx.(x + x) and λx.(2x) both denote a function that doubles its argument x. Applied to, say, the not yet fully evaluated argument (5 − 3), the usual call-by-name evaluation, in which argument expressions are simply substituted for the argument variables, would yield (5 − 3) + (5 − 3) and 2(5 − 3), respectively. In the first variant the result of the expression (5 − 3) would obviously be determined twice, which is superfluous, since both occurrences of (5 − 3) denote one and the same value. By recording this information in a graph, the implementation avoids the repeated computation. This is admissible only because functional programming languages satisfy the principle of referential transparency, i.e. an expression does not change its value during the execution of a program¹ but is merely transferred into another, equivalent form.

¹ Note that this statement does not hold for imperative programming languages.

To model the operational semantics of such an implementation by graph reduction, so-called call-by-need calculi are employed. There, a judicious choice of the calculus rules ensures that expressions can only be copied once they are in a certain form, which one might call a base value. The lazy character of the evaluation is of course preserved, i.e. a function application can deliver a result as soon as all relevant argument expressions have been evaluated. For the example above this means that the application of λx.(x + x) to the argument (5 − 3) is first transferred into an expression of the form let x = (5 − 3) in (x + x), where the let-construct represents the graph structure, the sharing, explicitly. Only once the result 2 has been determined for the term (5 − 3) is it substituted for x in the body x + x, thus avoiding the repeated computation.

Non-determinism and equality

Probably the most important question in a lambda calculus, not only with respect to call-by-need, is which kinds of terms may be copied. This also has an essential influence on which program transformations are admissible and which equalities should hold between programs.

As the example above has already illustrated, the two expressions λx.(x + x) and λx.(2x) should be regarded as equal. On the other hand, the (β)-rule customary in the λ-calculus can no longer hold without restriction:
    (λx.M) N = M[N/x]    (β)

Here M[N/x] denotes the substitution of the argument N for the variable x in the term M, which, as we have seen in the example above, can lead to repeated evaluation.

But how can an adequate notion of equality be found, given that the expression (5 − 3) + (5 − 3) delivers the same result as well, namely 4? Evidently it does not suffice to consider only the results of evaluations. Some works therefore pursue the approach of counting the number of evaluation steps, which also seems promising.

This thesis, however, takes a different route: a language construct pick is introduced which non-deterministically chooses one of its two arguments. Non-determinism of this form obviously violates referential transparency, but in return it helps to recognise when sharing is preserved and when it is not: the two expressions pick 5 3 + pick 5 3 and let x = pick 5 3 in (x + x) now deliver different sets of possible results, namely {6, 8, 10} as opposed to {6, 10} for the let-expression.

Beyond that, there is a multitude of application areas for non-deterministic lambda calculi, among others as a theoretical foundation for concurrent processes or for declarative input/output with side effects. With respect to the latter in particular, several works have already been carried out at the professorship, e.g. the dissertation of Arne Kutzner and the technical report on the calculus FUNDIO, to which this thesis understands itself as a complement and continuation. For the knowledge of the equalities that hold in a calculus plays an outstanding role in understanding it.
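The difference between the two pick-expressions can be reproduced in a small Python sketch that models a non-deterministic expression by the set of values it may evaluate to. This is only an illustrative analogue, not the thesis's calculus: the names pick, add and let_shared are hypothetical helpers.

```python
from itertools import product

def pick(a, b):
    # A non-deterministic choice, modelled by the set of its possible values.
    return {a, b}

def add(xs, ys):
    # Unshared addition: each operand makes its choice independently,
    # as in pick 5 3 + pick 5 3 under call-by-name copying.
    return {x + y for x, y in product(xs, ys)}

def let_shared(xs, body):
    # Shared let-binding: the choice is made once and then reused,
    # as in let x = pick 5 3 in (x + x) under call-by-need.
    return {v for x in xs for v in body({x})}

unshared = add(pick(5, 3), pick(5, 3))
shared = let_shared(pick(5, 3), lambda x: add(x, x))

print(sorted(unshared))  # [6, 8, 10]
print(sorted(shared))    # [6, 10]
```

The two result sets differ exactly as stated above, so the pick-construct makes the presence or absence of sharing observable.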
The works so far rely mainly on the notion of contextual equivalence, by which two terms s and t are identified if their evaluation shows the same termination behaviour, denoted ⇓, in all possible program contexts C[ ]:

    s ≃c t  ⟺  (∀C : C[s]⇓ ⟺ C[t]⇓)

Contextual equivalence is a powerful tool for judging the correctness of program transformations, in particular because by definition it is a congruence, i.e. its equalities can in turn be placed into contexts, satisfying the implication

    s ≃c t  ⟹  (∀C : C[s] ≃c C[t])

At times, however, equalities are difficult to establish, since the termination of the evaluation has to be considered in all possible contexts. It is therefore often useful to call on a different notion of equivalence: the method of bisimulation, which describes a more stepwise test for equality. Here two terms are regarded as equal if they show the same behaviour at all possible stages, that is, as long as the contrary cannot be demonstrated. The technique was first applied in the area of state transition systems, but by now it is well established in the field of functional programming and lambda calculi and exists in several variants. The style which Abramsky calls applicative bisimulation is most easily illustrated in the form of a preorder ≲:

    s ≲ t  ⟺  (s ⇓ λx.s′ ⟹ (t ⇓ λx.t′ ∧ ∀r : (λx.s′) r ≲ (λx.t′) r))

This is not a circular definition but is to be understood as a fixpoint equation, over which the greatest fixpoint is taken. Thereby the proof principle of co-induction becomes applicable. Setting ∼ = ≲ ∩ ≳, it is easy to see that ∼ ⊆ ≃c can be concluded, provided that ∼ is a congruence. If a preorder can be placed into contexts in this way, one speaks of a precongruence.
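The greatest-fixpoint reading can be made concrete on a finite state transition system, where the simulation preorder is computable by starting from the full relation and shrinking it until the defining clause holds. This is only a finite analogue under assumed names (similarity, states, trans are all illustrative): the thesis's applicative similarity ranges over lambda terms and cannot be computed by this enumeration.

```python
def similarity(states, trans):
    """Simulation preorder on a finite transition system.
    trans maps (state, label) -> set of successor states."""
    labels = {lab for (_, lab) in trans}
    succ = lambda s, lab: trans.get((s, lab), set())
    # Start from the full relation and shrink it: the greatest fixpoint.
    rel = {(s, t) for s in states for t in states}
    changed = True
    while changed:
        changed = False
        for (s, t) in sorted(rel):
            # s below t demands that every step of s is matched by some
            # step of t, with the successors again related.
            if not all(any((s2, t2) in rel for t2 in succ(t, lab))
                       for lab in labels for s2 in succ(s, lab)):
                rel.discard((s, t))
                changed = True
    return rel

# 'a' evaluates to either result 6 or 10 (a pick-like choice),
# while 'b' evaluates to result 6 only.
states = {'a', 'a6', 'a10', 'b', 'b6', 'v'}
trans = {('a', 'eval'): {'a6', 'a10'},
         ('b', 'eval'): {'b6'},
         ('a6', 'res6'): {'v'}, ('a10', 'res10'): {'v'},
         ('b6', 'res6'): {'v'}}

rel = similarity(states, trans)
print(('b', 'a') in rel)  # True: every behaviour of b is matched by a
print(('a', 'b') in rel)  # False: b cannot match a's step towards 10
```

The asymmetry mirrors the non-deterministic setting: the deterministic term is similar to the pick-term, but not conversely.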
Hence it is to be shown that the relation ≲, called similarity, is a precongruence. Not only this proof but already the design of a suitable such relation for a non-deterministic call-by-need lambda calculus constitutes a substantial gain in insight. For the usual approach to the definition of similarity