Multiparadigm Programming with Object-Oriented Languages

Publication Series of the John von Neumann Institute for Computing (NIC)
NIC Series Volume 13

Jörg Striegnitz, Kei Davis, Yannis Smaragdakis (Eds.)

Multiparadigm Programming with Object-Oriented Languages (MPOOL)
2nd International Workshop, 11 June 2002, Málaga, Spain
Proceedings

Organized by the John von Neumann Institute for Computing in cooperation with
Los Alamos National Laboratory, New Mexico, USA
Georgia Institute of Technology, Georgia, USA

Forschungszentrum Jülich in der Helmholtz-Gemeinschaft
Deutsches Elektronen-Synchrotron
Central Institute for Applied Mathematics

The John von Neumann Institute for Computing was established in 1998 by the Research Centre Jülich and the German Electron Synchrotron Foundation DESY in order to support supercomputer-aided scientific and engineering research and development in Germany, with the following tasks:

• Nationwide provision of supercomputer capacity for projects in science, research, and industry.
• Supercomputer-oriented research and development by research groups of competence in supercomputing applications.
• Education and training in the fields of supercomputing by symposia, workshops, summer schools, seminars, and courses.

Die Deutsche Bibliothek – CIP-Cataloguing-in-Publication-Data: A catalogue record for this publication is available from Die Deutsche Bibliothek.

Publisher: NIC-Directors
Distributor: NIC-Secretariat, Research Centre Jülich, 52425 Jülich, Germany
Internet: www.fz-juelich.de/nic
Printer: Graphische Betriebe, Forschungszentrum Jülich

© 2002 by John von Neumann Institute for Computing

Permission to make digital or hard copies of portions of this work for personal or classroom use is granted provided that the copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise requires prior specific permission by the publisher mentioned above.

NIC Series Volume 13
ISBN 3-00-009099-1

Preface

This volume contains the proceedings of the International Workshop on Multiparadigm Programming (MPOOL'02), held in Málaga, Spain on June 11, 2002.

Although the idea of combining programming paradigms and languages goes back to the late sixties, when a lot of research effort was spent on the development and investigation of extensible programming languages, this approach has lost neither its importance nor its elegance.

Programming languages and paradigms are thought models. Their distinguishing concepts have a great influence on how programmers approach the different stages of the software development process. With respect to software quality, it is therefore desirable to let the problem domain determine the choice of the programming paradigm. Especially for larger software projects, this implies the need for languages and tools that support the simultaneous use of different programming paradigms.
Today the object-oriented programming paradigm is dominant and ubiquitously employed for design, implementation, and even conceptualization, and a huge set of tools has been developed and successfully applied over the last two decades. MPOOL tries to bring together people who are working to build a bridge from OO-centered tools to a toolset that permits a free choice of paradigms.

Last year's MPOOL gave evidence that there exists a larger community working in this emerging area. The increased diversity of topics at this year's workshop shows that there is ongoing progress in programming languages, tools, concepts, and methodologies to support multiparadigm programming. One of the main goals of the workshop (and the reason for publishing this proceedings volume) is to promote and expose work that combines programming paradigms in the framework of OO languages. Building a consensus about the standard background work, the interesting problems, and the future directions is the way to form a community.

Acknowledgment. We wish to heartily thank all the authors for writing very interesting papers and all the members of the Programme Committee for their invaluable help in compiling the program.

June 2002
Kei Davis
Yannis Smaragdakis
Jörg Striegnitz

Workshop Organizers

Kei Davis, Los Alamos National Laboratory, New Mexico, USA
Yannis Smaragdakis, Georgia Institute of Technology, Georgia, USA
Jörg Striegnitz, John von Neumann Institute for Computing, Germany

Programme Committee

Gerald Baumgartner, Ohio State University, Ohio, USA
Kei Davis, Los Alamos National Laboratory, New Mexico, USA
Jaakko Järvi, Indiana University, Indiana, USA
Peter Van Roy, Université catholique de Louvain, Belgium
Yannis Smaragdakis, Georgia Institute of Technology, Georgia, USA
Jörg Striegnitz, John von Neumann Institute for Computing, Germany

Table of Contents

Object Programming in a Rule-Based Language with Strategies (Hubert Dubois, Hélène Kirchner) ..... 1
Modelica - A Declarative Object Oriented Multi-Paradigm Language (Peter Fritzson, Peter Bunus) ..... 27
The Return of Jensen's Device (Timothy A. Budd) ..... 45
Towards Linguistic Symbiosis of an Object-Oriented and a Logic Programming Language (Johan Brichau, Kris Gybels, Roel Wuyts) ..... 65
Replacing Refinement or Refining Replacement? (Sibylle Schupp) ..... 79
Java-Style Variable Binding in C++ (Thomas Becker) ..... 97
Dynamic Inheritance for a Prototype-based Language (Koji Kagawa) ..... 109
A Case in Multiparadigm Programming: User Interfaces by Means of Declarative Meta Programming (S. Goderis, W. De Meuter, J. Brichau) ..... 121

Object Programming in a Rule-Based Language with Strategies

Hubert Dubois, Hélène Kirchner
LORIA-Université Nancy 2 & LORIA-CNRS
BP 239, 54506 Vandœuvre-lès-Nancy Cedex, France
{Hubert.Dubois|[email protected]

Abstract. This paper presents a programming framework that combines the concepts of objects, rules, and strategies, built as an extension of ELAN, a rule-based language with strategies. The extension is implemented in a reflective way in ELAN itself and relies on the same formal semantics, namely the ρ-calculus.
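To make the combination of rules and strategies announced in this abstract concrete before the paper itself begins, the following minimal Python sketch (not ELAN code, and far simpler than the framework the paper describes) models a rewrite rule as a function from a term to a list of results and strategies such as first and repeat as combinators over such rules. The nested-tuple term representation and the two Peano-arithmetic rules are assumptions made purely for illustration.

```python
# Minimal sketch (not ELAN): a rewrite rule maps a term to a list of results
# (the empty list means the rule does not apply); strategies are combinators
# that decide which rules fire and how often.
# Terms are nested tuples, e.g. ("plus", ("zero",), x).

def plus_zero(term):
    """Rule: plus(zero, x) -> x."""
    if isinstance(term, tuple) and len(term) == 3 \
            and term[0] == "plus" and term[1] == ("zero",):
        return [term[2]]
    return []

def plus_succ(term):
    """Rule: plus(succ(x), y) -> succ(plus(x, y))."""
    if isinstance(term, tuple) and len(term) == 3 and term[0] == "plus" \
            and isinstance(term[1], tuple) and term[1] and term[1][0] == "succ":
        return [("succ", ("plus", term[1][1], term[2]))]
    return []

def first(*rules):
    """Choice strategy: return the results of the first rule that applies."""
    def strat(term):
        for rule in rules:
            results = rule(term)
            if results:
                return results
        return []
    return strat

def repeat(strat):
    """Apply a strategy at the root of the term until it no longer applies."""
    def run(term):
        while True:
            results = strat(term)
            if not results:
                return term
            term = results[0]  # follow one branch of the non-deterministic result
    return run

term = ("plus", ("zero",), ("plus", ("zero",), ("succ", ("zero",))))
print(repeat(first(plus_zero, plus_succ))(term))  # ('succ', ('zero',))
```

In ELAN itself rules are labelled and the strategy language offers richer non-deterministic choice operators; the point of the sketch is only the architectural split between rules (what may be rewritten) and strategies (how and in what order the rules are applied).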
1 Introduction

Object-based languages and rule-based languages have independently emerged as programming paradigms in the eighties. Languages like Ada [Ros92], Smalltalk-80 [GR83], Eiffel [Mey92] or Java [AG96] are well known and widely used, but often lack a semantic basis. Rule-based systems, initially used in the artificial intelligence community, have gained interest with the development of efficient compilers.

The ELAN system [BCD+00] provides a very general approach to rule-based programming. ELAN offers an environment for specifying and prototyping deduction systems in a language based on rewrite rules controlled by strategies. It gives a natural and simple logical framework for the combination of computation and deduction paradigms. It supports the design of theorem provers, logic programming languages, constraint solvers and decision procedures, and offers a modular framework for studying their combination. ELAN has a clear operational semantics based on rewriting logic [BKKR01] and on the rewriting calculus [Cir00]. Its implementation involves compiled matching and reduction techniques integrating associative and commutative functions. Non-deterministic computations return several results, whose enumeration is handled by a few choice-strategy constructors. A strategy language is available to control rewriting; evaluation of strategies and strategy application is again based on rewriting. However, ELAN lacks object-oriented features: the notion of state, which provides more structure, and the ability to define structures that share the same properties.

More recently, the combination of object-based languages and rule-based languages has proved quite relevant for formalizing and solving advanced industrial problems that require some form of reasoning. Among many other languages, let us mention three that are closely related to our approach: CLAIRE [CL96], Oz [HSW95] and Maude [CDE+00].

CLAIRE combines objects and propagation rules in the logic programming style for problem solving and combinatorial optimization. Single inheritance and full polymorphism are supported. CLAIRE was designed to deal with applications that have complex data structures, together with the ability to define rules. CLAIRE rules associate a logical condition with an expression involving one or two objects: when a condition evaluates to true, the expression is evaluated for the given objects (a schematic sketch of this style of rule is given below). Conditions are events that denote an evolution of an entity (creation of an object, modification of an attribute, etc.). Rules can be grouped together, but control is not explicit.

Oz is a concurrent object-oriented constraint language that offers multiple inheritance, higher-order
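The CLAIRE-style coupling of objects and rules described above, a logical condition over objects that is re-checked when an event such as an attribute modification occurs and that triggers the evaluation of an associated expression, can be mimicked in a few lines of a mainstream OO language. The sketch below is hypothetical Python, not CLAIRE: the RuleBase registry, the Tank class, and the single-object condition are assumptions made for the example, and real CLAIRE rules may also relate two objects.

```python
# Hypothetical sketch (not CLAIRE syntax): condition/action rules over objects,
# triggered by an event -- here, the modification of an attribute.

class RuleBase:
    """A registry of (condition, action) pairs over a single object."""
    def __init__(self):
        self.rules = []

    def rule(self, condition):
        """Decorator that registers an action guarded by a condition."""
        def register(action):
            self.rules.append((condition, action))
            return action
        return register

    def notify(self, obj):
        """Fire every rule whose condition holds for the modified object."""
        for condition, action in self.rules:
            if condition(obj):
                action(obj)

rules = RuleBase()

class Tank:
    def __init__(self, level, capacity):
        self.capacity = capacity
        self.level = level

    def __setattr__(self, name, value):
        object.__setattr__(self, name, value)
        rules.notify(self)  # an attribute modification is the triggering event

@rules.rule(lambda t: getattr(t, "level", 0) > t.capacity)
def clamp_overflow(tank):
    print("overflow detected: clamping level to capacity")
    object.__setattr__(tank, "level", tank.capacity)  # bypass the event to avoid re-firing

tank = Tank(level=10, capacity=100)
tank.level = 250   # event -> condition true -> action fires
print(tank.level)  # 100
```

The point mirrored from the description above is that the event (here, an attribute assignment) drives rule evaluation, while grouping and control of the rules remain implicit.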