NV: An Intermediate Language for Verification of Network Control Planes

Total pages: 16

File type: PDF, size: 1020 KB

Nick Giannarakis (Princeton University, USA), Devon Loehr (Princeton University, USA), Ryan Beckett (Microsoft Research, USA), David Walker (Princeton University, USA)

Abstract

Network misconfiguration has caused a raft of high-profile outages over the past decade, spurring researchers to develop a variety of network analysis and verification tools. Unfortunately, developing and maintaining such tools is an enormous challenge due to the complexity of network configuration languages. Inspired by work on intermediate languages for verification such as Boogie and Why3, we develop NV, an intermediate language for verification of network control planes. NV carefully walks the line between expressiveness and tractability, making it possible to build models for a practical subset of real protocols and their configurations, and also facilitate rapid development of tools that outperform state-of-the-art simulators (seconds vs minutes) and verifiers (often 10x faster). Furthermore, we show that it is possible to develop novel analyses just by writing new NV programs. In particular, we implement a new fault-tolerance analysis that scales to far larger networks than existing tools.

CCS Concepts: • Networks → Protocol testing and verification; • Theory of computation → Automated reasoning; Verification by model checking.

Keywords: Network Verification, Network Simulation, Control Plane Analysis, Router Configuration Analysis, Intermediate Verification Language

ACM Reference Format: Nick Giannarakis, Devon Loehr, Ryan Beckett, and David Walker. 2020. NV: An Intermediate Language for Verification of Network Control Planes. In Proceedings of the 41st ACM SIGPLAN International Conference on Programming Language Design and Implementation (PLDI ’20), June 15–20, 2020, London, UK. ACM, New York, NY, USA, 16 pages. https://doi.org/10.1145/3385412.3386019

1 Introduction

Following the explosive rise in the number of internet users and the emergence of cloud computing, networks have seen significant growth in their size and complexity. However, network operators have a hard time keeping up with this unprecedented growth: in the past few years, network misconfiguration incidents have caused outages whose effects range from popular websites like Facebook being inaccessible [42] to cloud services going offline [35, 41, 46], to airlines grounding their flights [23, 34, 37].

To help improve network reliability, researchers have developed numerous verification and testing tools over the last decade. One wave of tools included Anteater [33], Header Space Analysis (HSA) [26], Veriflow [27], NetKAT [4, 18, 44], NoD [31] and Bayonet [19]. These tools were designed to analyze the network data plane, the component of a network responsible for forwarding (or blocking) user traffic from point A to point B. The tools developed can check both deterministic and probabilistic properties of how packets flow through the data plane, and scale to networks with thousands of devices and millions of packet-forwarding rules.

A second wave of tools has focused on analyzing the network control plane. The control plane is the component of a network that gathers information about available routes (paths) to destinations and chooses which routes to use. Once the control plane has chosen its preferred routes, it passes them off to the data plane, which implements them. From time to time, the control plane will receive new information about the available routes (e.g., when failures occur), and when it does, it may compute new routes to a destination. Tools that analyze the control plane, such as C-BGP [40], rcc [15], Batfish [17], ARC [21], ERA [14], Bagpipe [49], MineSweeper [6], FastPlane [32], and ShapeShifter [8], will analyze the router configurations that specify how routes are to be chosen and will compute the outcome of those choices (i.e., compute the routes that will be used by the data plane). These tools can then answer questions such as whether the control plane will compute a route from A to B (i.e., control plane reachability), or whether it will compute a route from A to B regardless of which single link failure occurs (i.e., control plane fault tolerance). However, control plane tools are usually less scalable than the data plane tools, particularly when it comes to checking properties like control plane fault tolerance, which may involve computing different routes from A to B for every possible failure scenario.
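To make the route computation and the two properties above concrete, the sketch below simulates a tiny control plane and checks them by brute force. It is an illustrative OCaml program, not NV code and not taken from the paper; the function names (simulate, reachable, fault_tolerant) and the fewest-hops route metric are assumptions made for this example.

```ocaml
(* A toy "control plane": each node picks its best route (here, fewest hops)
   to a destination, based on what its neighbors advertise.  Illustrative
   only; not NV code. *)

type route = int option                      (* None = no route; Some k = k hops *)

let merge (a : route) (b : route) : route =  (* prefer the shorter route *)
  match a, b with
  | None, r | r, None -> r
  | Some x, Some y -> Some (min x y)

let trans (r : route) : route =              (* crossing a link adds one hop *)
  match r with None -> None | Some k -> Some (k + 1)

(* One round: every node re-evaluates using its neighbors' current routes. *)
let step edges routes =
  Array.mapi
    (fun v rv ->
      List.fold_left
        (fun acc (u, w) ->
          if w = v then merge acc (trans routes.(u)) else acc)
        rv edges)
    routes

let simulate n edges dest =
  let init = Array.init n (fun v -> if v = dest then Some 0 else None) in
  let rec fix r = let r' = step edges r in if r' = r then r else fix r' in
  fix init

let reachable routes = Array.for_all (fun r -> r <> None) routes

(* "Control plane fault tolerance": does every node still reach [dest] no
   matter which single link fails?  Each directed edge is treated as its own
   link here, to keep the sketch short. *)
let fault_tolerant n edges dest =
  List.for_all
    (fun e -> reachable (simulate n (List.filter (( <> ) e) edges) dest))
    edges

let () =
  let edges = [ (0, 1); (1, 0); (1, 2); (2, 1); (0, 2); (2, 0) ] in
  Printf.printf "single-link fault tolerant: %b\n" (fault_tolerant 3 edges 2)
```

Checking fault tolerance this way requires one full simulation per failure scenario, which is exactly why such properties are expensive to verify at scale.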
This paper focuses exclusively on control plane analysis tools and the difficulty of building such tools.¹ Much of this difficulty arises from the fact that router vendors such as Cisco and Juniper each have their own proprietary configuration languages that consist of an enormous number of ad hoc commands that are specific to a particular protocol, such as OSPF or BGP. To analyze a network control plane, one must first parse these varied configuration languages and interpret their semantics.

Fortunately, colleagues at Intentionet have built Batfish [17], a tool that parses these configurations and creates an intermediate representation (IR) on top of which researchers can develop their own analyses. Indeed, researchers from a number of different institutions have built verification and simulation tools using Batfish as a front end [6–8, 14, 20, 21].

While Batfish is a remarkably useful front end for managing configurations, its IR does not fundamentally change the abstractions present in the vendor languages it supports—it represents those surface-level abstractions directly. Doing so has the advantage of being expedient and straightforward, but causes Batfish to inherit some of the less desirable properties of the configuration languages themselves. In particular, the Batfish IR is quite verbose: a recent examination of the IR showed it contained 24 statements and more than 100 types of expressions. It has 19 different ways just to modify fields of a routing message. Implementing configuration analysis back-ends like MineSweeper [6] requires understanding the semantics of all these different kinds of statements and expressions, which is challenging. Moreover, Batfish continues to evolve, adding new vendor features, and with them new components to its IR. This makes it a challenge to maintain analysis tools. It is even harder to develop and maintain tools like ShapeShifter [8] that need to transform conventional routing protocols, as the Batfish IR does not provide the "building blocks" necessary to represent new, non-standard routing protocols.

[…] and, in this case, networking. Fortunately, we can simplify the task by first parsing source languages into a surface-level IR (i.e., Batfish’s IR), and then encoding the semantics of this IR into a lower-level IR designed specifically for verification. Such an architecture separates key concerns and simplifies each aspect of the pipeline. Indeed, this methodology has been used successfully for general-purpose programming languages in systems such as Boogie [30], Why [16], and CIVL [43]. The key challenge comes in the design of the low-level intermediate language for verification: it should be succinct, expressive enough to encode the semantics of the source language, and tractable enough to admit efficient analysis of its features.

A functional language for modeling routing protocols. Our first contribution is the design of NV, a functional language for modeling routing protocols. NV uses conventional, expressive and compositional constructs with ordinary, broadly-understood semantics, such as integers, booleans, functions, and records. Additionally, it allows users to express unknowns, such as potential failures or actions of neighboring networks, in a general way, using symbolic values, and lets users specify network properties via assertions. NV’s design provides building blocks for modeling both standard routing protocols and variations thereon. The latter is quite useful as large cloud providers have been known to tweak standard protocols for internal use; accommodating these tweaks is easier in NV. In addition, these building blocks can be used to implement transformations of NV programs. Optimizations, such as partial evaluation, or transformations, such as message [8] or topology [7, 22] abstractions, can be implemented as NV-to-NV transformations, independent of one another and of the back-end analysis used. Prior to this research, designing a network verification tool involved simultaneously interpreting the low-level commands from various vendors (Cisco, Juniper, Arista, Force10, etc.), optimizing their representation, and converting them to the appropriate structure (e.g., SMT constraints) all at once.
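As an illustration of the building blocks just listed (records for routing messages, functions that transfer and merge routes, symbolic values for unknowns, and assertions for properties), here is an OCaml-flavored sketch. It is not NV syntax; the field names, the preference order in merge, and the way the symbolic value is written down are assumptions made for this example only.

```ocaml
(* Illustrative only: the kinds of pieces an NV-like model is built from. *)

type attribute = {          (* a BGP-like routing message, as an ordinary record *)
  prefix_len : int;
  local_pref : int;
  path_len   : int;
}

(* How a message changes as it crosses a link (here: one more hop). *)
let trans (a : attribute option) : attribute option =
  Option.map (fun attr -> { attr with path_len = attr.path_len + 1 }) a

(* How a node chooses between two candidate routes:
   higher local preference first, then shorter path. *)
let merge (a : attribute option) (b : attribute option) : attribute option =
  match a, b with
  | None, r | r, None -> r
  | Some x, Some y ->
      Some (if (x.local_pref, - x.path_len) >= (y.local_pref, - y.path_len)
            then x else y)

(* A "symbolic" value: an unknown announcement from a neighboring network.
   A verifier would range over all such values; a simulator picks one. *)
let symbolic_neighbor_route : attribute option =
  Some { prefix_len = 24; local_pref = 100; path_len = 0 }

(* A property to check of the final route chosen at some node. *)
let assert_route (r : attribute option) : bool =
  match r with Some a -> a.path_len <= 5 | None -> false
```

A simulator would run merge and trans to a fixed point over the network graph, much like the earlier sketch, while a verifier would treat the symbolic value as an unconstrained unknown and check the assertion for every possible choice.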
The most difficult design challenge in NV was oriented around the definition of a dictionary (finite map) type. These dictionaries need to be expressive enough to encode the key-value stores and the sets processed by most routing protocols, yet simple enough to admit efficient encodings both as SMT formulae for use in symbolic verification techniques, and as multi-terminal binary decision diagrams (MTBDDs) for simulation-based analysis. Such efficient encodings are key to scaling NV to large networks.
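One way to picture why such a dictionary type can admit compact encodings is to represent each map as a default value plus a finite set of exceptions, so that sets and key-value stores share a single representation and bulk operations only visit the exceptional keys. The module below is an illustrative OCaml sketch built on that assumption; it is not NV's actual dictionary implementation, and the connection to SMT and MTBDD encodings is only suggested by the structure.

```ocaml
(* A total map ("dictionary") over integer keys, represented as a default
   value plus finitely many exceptions.  Sets become maps into bool. *)
module TotalMap = struct
  module M = Map.Make (Int)

  type 'v t = { default : 'v; except : 'v M.t }

  let create default = { default; except = M.empty }

  let get k m =
    match M.find_opt k m.except with Some v -> v | None -> m.default

  let set k v m =
    (* storing the default again just removes the exception *)
    if v = m.default then { m with except = M.remove k m.except }
    else { m with except = M.add k v m.except }

  (* Applying a function pointwise only visits the exceptional keys, which is
     the intuition behind compact, shared representations of such maps. *)
  let map f m = { default = f m.default; except = M.map f m.except }
end

let () =
  (* a set of BGP community tags, as a map from tag to bool *)
  let communities =
    TotalMap.(create false |> set 65001 true |> set 65010 true)
  in
  Printf.printf "has 65001: %b, has 7: %b\n"
    (TotalMap.get 65001 communities) (TotalMap.get 7 communities)
```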