Theoretical Computer Science Knowledge: a selection of the latest research. Visit Springer.com for an extensive range of books and journals in the field.
Recommended publications
Anna Lysyanskaya Curriculum Vitae
Anna Lysyanskaya, Curriculum Vitae. Computer Science Department, Box 1910, Brown University, Providence, RI 02912. (401) 863-7605, email: [email protected], http://www.cs.brown.edu/~anna
Research Interests: Cryptography, privacy, computer security, theory of computation.
Education: Massachusetts Institute of Technology, Cambridge, MA. Ph.D. in Computer Science, September 2002. Advisor: Ronald L. Rivest, Viterbi Professor of EECS. Thesis title: "Signature Schemes and Applications to Cryptographic Protocol Design." Massachusetts Institute of Technology, Cambridge, MA. S.M. in Computer Science, June 1999. Smith College, Northampton, MA. A.B. magna cum laude, Highest Honors, Phi Beta Kappa, May 1997.
Appointments: Brown University, Providence, RI, Fall 2013 to present: Professor of Computer Science. Brown University, Fall 2008 to Spring 2013: Associate Professor of Computer Science. Brown University, Fall 2002 to Spring 2008: Assistant Professor of Computer Science. UCLA, Los Angeles, CA, Fall 2006: Visiting Scientist at the Institute for Pure and Applied Mathematics (IPAM). Weizmann Institute, Rehovot, Israel, Spring 2006: Visiting Scientist. Massachusetts Institute of Technology, Cambridge, MA, 1997-2002: Graduate student. IBM T. J. Watson Research Laboratory, Hawthorne, NY, Summer 2001: Summer Researcher. IBM Zürich Research Laboratory, Rüschlikon, Switzerland, Summers 1999 and 2000: Summer Researcher.
Teaching: Brown University, Spring 2008, 2011, 2015, 2017, 2019 and Fall 2012: Instructor for "CS 259: Advanced Topics in Cryptography," a seminar course for graduate students. Spring 2012: Instructor for "CS 256: Advanced Complexity Theory," a graduate-level complexity theory course. Fall 2003, 2004, 2005, 2010, 2011 and Spring 2007, 2009, 2013, 2014, 2016, 2018: Instructor for "CS 151: Introduction to Cryptography and Computer Security." Fall 2016, 2018: Instructor for "CS 101: Theory of Computation," a core course for CS concentrators.
What Every Computer Scientist Should Know About Floating-Point Arithmetic
What Every Computer Scientist Should Know About Floating-Point Arithmetic. David Goldberg, Xerox Palo Alto Research Center, 3333 Coyote Hill Road, Palo Alto, California 94304.
Floating-point arithmetic is considered an esoteric subject by many people. This is rather surprising, because floating-point is ubiquitous in computer systems: almost every language has a floating-point datatype; computers from PCs to supercomputers have floating-point accelerators; most compilers will be called upon to compile floating-point algorithms from time to time; and virtually every operating system must respond to floating-point exceptions such as overflow. This paper presents a tutorial on the aspects of floating-point that have a direct impact on designers of computer systems. It begins with background on floating-point representation and rounding error, continues with a discussion of the IEEE floating-point standard, and concludes with examples of how computer system builders can better support floating point.
Categories and Subject Descriptors: (Primary) C.0 [Computer Systems Organization]: General: instruction set design; D.3.4 [Programming Languages]: Processors: compilers, optimization; G.1.0 [Numerical Analysis]: General: computer arithmetic, error analysis, numerical algorithms. (Secondary) D.2.1 [Software Engineering]: Requirements/Specifications: languages; D.3.1 [Programming Languages]: Formal Definitions and Theory: semantics; D.4.1 [Operating Systems]: Process Management: synchronization.
General Terms: Algorithms, Design, Languages.
Additional Key Words and Phrases: denormalized number, exception, floating-point, floating-point standard, gradual underflow, guard digit, NaN, overflow, relative error, rounding error, rounding mode, ulp, underflow.
INTRODUCTION. Builders of computer systems often need information about floating-point arithmetic. ... operations of addition, subtraction, multiplication, and division. It also contains background information on the two methods of measuring rounding error, ...
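To make the paper's subject concrete, here is a minimal Python sketch (not taken from Goldberg's tutorial) of the kind of rounding error it analyzes: decimal fractions such as 0.1 have no exact binary floating-point representation, and the error of a simple sum is on the order of one unit in the last place.

```python
import sys

# 0.1 and 0.2 have no exact IEEE 754 binary representation, so the
# computed sum differs from the mathematically exact value 0.3.
s = 0.1 + 0.2
print(s == 0.3)   # False
print(s)          # 0.30000000000000004

# Relative error of the computed sum, compared with machine epsilon
# (2**-52 for IEEE 754 double precision); the discrepancy is roughly
# one ulp, i.e. about as small as correctly rounded arithmetic allows.
eps = sys.float_info.epsilon
rel_err = abs(s - 0.3) / 0.3
print(rel_err, rel_err / eps)
```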
Cryptography
Security Engineering: A Guide to Building Dependable Distributed Systems. Chapter 5: Cryptography.
ZHQM ZMGM ZMFM —G. JULIUS CAESAR
XYAWO GAOOA GPEMO HPQCW IPNLG RPIXL TXLOA NNYCS YXBOY MNBIN YOBTY QYNAI —JOHN F. KENNEDY
5.1 Introduction. Cryptography is where security engineering meets mathematics. It provides us with the tools that underlie most modern security protocols. It is probably the key enabling technology for protecting distributed systems, yet it is surprisingly hard to do right. As we've already seen in Chapter 2, "Protocols," cryptography has often been used to protect the wrong things, or used to protect them in the wrong way. We'll see plenty more examples when we start looking in detail at real applications.
Unfortunately, the computer security and cryptology communities have drifted apart over the last 20 years. Security people don't always understand the available crypto tools, and crypto people don't always understand the real-world problems. There are a number of reasons for this, such as different professional backgrounds (computer science versus mathematics) and different research funding (governments have tried to promote computer security research while suppressing cryptography). It reminds me of a story told by a medical friend. While she was young, she worked for a few years in a country where, for economic reasons, they'd shortened their medical degrees and concentrated on producing specialists as quickly as possible. One day, a patient who'd had both kidneys removed and was awaiting a transplant needed her dialysis shunt redone. The surgeon sent the patient back from the theater on the grounds that there was no urinalysis on file.
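The chapter opens with two classically enciphered epigraphs. As a toy illustration of the sort of substitution cipher tradition attributes to Caesar (this sketch is not from the book, and the shift of 3 is merely the conventional choice, not a claim about the epigraph above), here is a minimal Caesar cipher in Python:

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions; other characters pass through."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)

ciphertext = caesar("ATTACK AT DAWN", 3)
print(ciphertext)              # DWWDFN DW GDZQ
print(caesar(ciphertext, -3))  # ATTACK AT DAWN
```

Decryption is simply encryption with the opposite shift, which is exactly why such ciphers offer no real security.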
Graph Visualization and Navigation in Information Visualization
Graph Visualization and Navigation in Information Visualization: A Survey. Ivan Herman, Member, IEEE CS Society, Guy Melançon, and M. Scott Marshall.
Abstract: This is a survey on graph visualization and navigation techniques, as used in information visualization. Graphs appear in numerous applications such as web browsing, state-transition diagrams, and data structures. The ability to visualize and to navigate in these potentially large, abstract graphs is often a crucial part of an application. Information visualization has specific requirements, which means that this survey approaches the results of traditional graph drawing from a different perspective.
Index Terms: Information visualization, graph visualization, graph drawing, navigation, focus+context, fish-eye, clustering.
1 Introduction. Although the visualization of graphs is the subject of this survey, it is not about graph drawing in general. Excellent bibliographic surveys [4], [34], books [5], or even on-line tutorials [26] exist for graph drawing. Instead, the handling of graphs is considered with respect to information visualization. Information visualization has become a large field and "sub-fields" are beginning to emerge (see for example Card et al. [16] for a recent collection of papers from the last decade). ... involved in graph visualization: "Where am I?" "Where is the file that I'm looking for?" Other familiar types of graphs include the hierarchy illustrated in an organisational chart and taxonomies that portray the relations between species. Web site maps are another application of graphs, as well as browsing history. In biology and chemistry, graphs are applied to evolutionary trees, phylogenetic trees, molecular maps, genetic maps, biochemical pathways, and protein functions. Other areas of application include object-oriented systems (class browsers), data structures (compiler data structures in particular), real-time systems (state-transition ...
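As a minimal, purely illustrative companion to the survey's subject (the example below is not from the paper, and the page names are invented), a graph such as a web-site map can be stored as an adjacency list, and the navigational question "where is the page I'm looking for?" answered with a breadth-first search:

```python
from collections import deque

# A tiny web-site map as an adjacency list (page -> pages it links to).
site_map = {
    "home":     ["products", "about", "blog"],
    "products": ["pricing"],
    "about":    ["contact"],
    "blog":     [],
    "pricing":  [],
    "contact":  [],
}

def find_path(graph, start, goal):
    """Breadth-first search returning one shortest path, or None."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None

print(find_path(site_map, "home", "contact"))  # ['home', 'about', 'contact']
```

Visualizing and navigating graphs of realistic size is, of course, exactly where the techniques surveyed in the paper come in.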
Experimental Algorithmics: From Algorithm Design to Robust and Efficient Software
Lecture Notes in Computer Science 2547. Edited by G. Goos, J. Hartmanis, and J. van Leeuwen. Springer: Berlin, Heidelberg, New York, Barcelona, Hong Kong, London, Milan, Paris, Tokyo.
Rudolf Fleischer, Bernard Moret, Erik Meineche Schmidt (Eds.): Experimental Algorithmics: From Algorithm Design to Robust and Efficient Software.
Volume Editors: Rudolf Fleischer, Hong Kong University of Science and Technology, Department of Computer Science, Clear Water Bay, Kowloon, Hong Kong, e-mail: [email protected]. Bernard Moret, University of New Mexico, Department of Computer Science, Farris Engineering Bldg, Albuquerque, NM 87131-1386, USA, e-mail: [email protected]. Erik Meineche Schmidt, University of Aarhus, Department of Computer Science, Bld. 540, Ny Munkegade, 8000 Aarhus C, Denmark, e-mail: [email protected].
Cataloging-in-Publication Data applied for. A catalog record for this book is available from the Library of Congress. Bibliographic information published by Die Deutsche Bibliothek: Die Deutsche Bibliothek lists this publication in the Deutsche Nationalbibliografie; detailed bibliographic data is available in the Internet at <http://dnb.ddb.de>.
CR Subject Classification (1998): F.2.1-2, E.1, G.1-2. ISSN 0302-9743. ISBN 3-540-00346-0, Springer-Verlag Berlin Heidelberg New York.
This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, re-use of illustrations, recitation, broadcasting, reproduction on microfilms or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer-Verlag.
The Computer Scientist As Toolsmith—Studies in Interactive Computer Graphics
Frederick P. Brooks, Jr. Fred Brooks is the first recipient of the ACM Allen Newell Award, an honor to be presented annually to an individual whose career contributions have bridged computer science and other disciplines. Brooks was honored for a breadth of career contributions within computer science and engineering and his interdisciplinary contributions to visualization methods for biochemistry. Here, we present his acceptance lecture delivered at SIGGRAPH 94.
The Computer Scientist as Toolsmith II
It is a special honor to receive an award named for Allen Newell. Allen was one of the fathers of computer science. He was especially important as a visionary and a leader in developing artificial intelligence (AI) as a subdiscipline, and in enunciating a vision for it. What a man is is more important than what he does professionally, however, and it is Allen's humble, honorable, and self-giving character that makes it a double honor to be a Newell awardee. I am profoundly grateful to the awards committee. Rather than talking about one particular research ...
... computer science. Another view of computer science sees it as a discipline focused on problem-solving systems, and in this view computer graphics is very near the center of the discipline.
A Discipline Misnamed. When our discipline was newborn, there was the usual perplexity as to its proper name. We at Chapel Hill, following, I believe, Allen Newell and Herb Simon, settled on "computer science" as our department's name. Now, with the benefit of three decades' hindsight, I believe that to have been a mistake. If we understand why, we will better understand our craft.
Algorithmics and Modeling Aspects of Network Slicing in 5G and Beyonds Network: Survey
Algorithmics and Modeling Aspects of Network Slicing in 5G and Beyonds Network: Survey. Fadoua Debbabi (1,2), Rihab Jmal (2), Lamia Chaari Fourati (2), and Adlene Ksentini (3). 1: University of Sousse, ISITCom, 4011 Hammam Sousse, Tunisia (e-mail: [email protected]). 2: Laboratory of Technologies for Smart Systems (LT2S), Digital Research Center of Sfax (CRNS), Sfax, Tunisia (e-mail: [email protected], [email protected]). 3: Eurecom, France (e-mail: [email protected]). Corresponding author: Rihab Jmal (e-mail: [email protected]).
ABSTRACT: One of the key goals of future 5G networks is to incorporate many different services into a single physical network, where each service has its own logical network isolated from other networks. In this context, Network Slicing (NS) is considered the key technology for meeting the service requirements of diverse application domains. Recently, NS has faced several algorithmic challenges in 5G networks. This paper provides a review of NS architecture, with a focus on the relevant management and orchestration architecture across multiple domains. In addition, this survey delivers a deep analysis and a taxonomy of NS algorithmic aspects. Finally, the paper highlights some open issues and future directions.
INDEX TERMS: Network Slicing, Next-generation networking, 5G mobile communication, QoS, Multi-domain, Management and Orchestration, Resource allocation.
I. INTRODUCTION. A. CONTEXT AND MOTIVATION. The high volume of generated traffic from smart systems [1] and Internet of Everything applications [2] complicates the network's activities and management of cur... ... promising solutions to such network re-engineering [3]. NS [9] allows the creation of several fully-fledged virtual networks (core, access, and transport network) over the same infrastructure, while maintaining the isolation among slices. FIGURE 1 shows the way different slices are isolated.
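To make the idea of isolated slices over one shared infrastructure slightly more concrete, here is a small Python sketch; it is purely illustrative and not taken from the survey, and the slice names and capacity figures are invented. Each slice reserves a share of a common resource pool and cannot be created if that share is unavailable:

```python
class Infrastructure:
    """A shared physical resource pool carved into isolated slices."""

    def __init__(self, total_bandwidth_mbps: int):
        self.total = total_bandwidth_mbps
        self.slices = {}  # slice name -> reserved bandwidth (Mbps)

    def create_slice(self, name: str, bandwidth_mbps: int) -> None:
        reserved = sum(self.slices.values())
        if reserved + bandwidth_mbps > self.total:
            raise ValueError(f"not enough capacity for slice '{name}'")
        self.slices[name] = bandwidth_mbps

# Three hypothetical slices sharing one 1000 Mbps substrate.
infra = Infrastructure(total_bandwidth_mbps=1000)
infra.create_slice("eMBB", 600)    # high-throughput mobile broadband
infra.create_slice("URLLC", 300)   # low-latency, high-reliability traffic
infra.create_slice("mMTC", 100)    # massive IoT, low rate per device
print(infra.slices)
```

Real slice orchestration allocates compute, storage, and network resources across core, access, and transport domains, but the admission check above captures the basic isolation contract.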
Computer Science Average Starting Salary
What can I do with a major in… Computer Science. CSE Career Outcomes: Average Starting Salary: $73,854*. Post-graduation Outcomes:* ...
Computer and information technology impacts many areas of our daily lives, from downloading a song to driving a car. Because many of our daily tasks involve the use of technology, computer scientists can be found in nearly all professional sectors, including big technology firms, government agencies, startups, nonprofits, and local businesses, both large and small. Computer science majors possess a broad variety of skills that make them valuable to all businesses, and there is an increasing need for industry to have knowledgeable computer professionals. At the heart of the computer scientist is a passion to benefit society by solving problems through computer and information technology. They conceive, design, and test logical structures for solving problems by computer and find ways to do so by designing applications and writing software to make computers do new things or accomplish tasks more efficiently. This may include creating applications for mobile devices, writing web-based applications to power e-commerce and social networking sites, developing large enterprise systems for financial institutions, creating control software for robots, programming the next blockbuster video game, or identifying genes for the next biotech breakthrough. All of these advancements may involve writing detailed instructions that list the order of steps a computer must follow to accomplish a necessary function, developing methods for computerizing business and scientific tasks, maximizing efficiency of computer systems already in use, or enhancing or building immersive systems so people are better able to socialize and interact with technology. Computer scientists often work on a more abstract level than other computer professionals.
Mathematics and Computer Science
Mathematics and Computer Science. The Mathematics and Computer Science Department at Benedictine College is committed to maintaining a curriculum that provides students with the necessary tools to enter a career in their field with a broad, solid knowledge of mathematics or computer science. Our students are provided with the knowledge, analytical, and problem-solving skills necessary to function as mathematicians or computer scientists in our world today. The mathematics curriculum prepares students for graduate study, for responsible positions in business, industry, and government, and for teaching positions in secondary and elementary schools. Basic skills and techniques provide for entering a career as an actuary, banker, bio-mathematician, computer programmer, computer scientist, economist, engineer, industrial researcher, lawyer, management consultant, market research analyst, mathematician, mathematics teacher, operations researcher, quality control specialist, statistician, or systems analyst. Computer science is an area of study that is important in the technological age in which we live.
... Ma 315, Probability and Statistics; Ma 356, Modern Algebra I; Ma 360, Modern Algebra II or Ma 480, Introduction to Real Analysis; Ma 488, Senior Comprehensive; Ma 493, Directed Research; six hours of upper-division math electives; and Cs 114, Introduction to Computer Science I or Cs 230, Programming for Scientists and Engineers.
Requirements for a major in Computer Science: Cs 114, Introduction to Computer Science I; Cs 115, Introduction to Computer Science II; Ma 255, Discrete Mathematical Structures I; Cs 256, Discrete Mathematical Structures II; Cs 300, Information & Knowledge Management; Cs 351, Algorithm Design and Data Analysis; Cs 421, Computer Architecture; Cs 440, Operating Systems and Networking; Cs 488, Senior Comprehensive; Cs 492, Software Development and Professional Practice; Cs 493, Senior Capstone.
What Is Computer Science All About?
WHAT IS COMPUTER SCIENCE ALL ABOUT? H. Conrad Cunningham and Pallavi Tadepalli, Department of Computer and Information Science, University of Mississippi.
COMPUTERS EVERYWHERE. As scientific and engineering disciplines go, computer science is still quite young. Although the mathematical roots of computer science go back more than a thousand years, it is only with the invention of the programmable electronic digital computer during the World War II era of the 1930s and 1940s that the modern discipline of computer science began to take shape. As it has developed, computer science includes theoretical studies, experimental methods, and engineering design all in one discipline. One of the first computers was the ENIAC (Electronic Numerical Integrator and Computer), developed in the mid-1940s at the University of Pennsylvania. When construction was completed in 1946, the ENIAC cost about $500,000. In today's terms, that is about $5,000,000. It weighed 30 tons, occupied as much space as a small house, and consumed 160 kilowatts of electric power. Figure 1 is a classic U.S. Army photograph of the ENIAC. The ENIAC and most other computers of that era were designed for military purposes, such as calculating firing tables for artillery. As a result, many observers viewed the market for such devices to be quite small. The observers were wrong!
Figure 1. ENIAC in a classic U.S. Army photograph.
Electronics technology has improved greatly in 60 years. Today, a computer with the capacity of the ENIAC would be smaller than a coin from our pockets, would consume little power, and cost just a few dollars on the mass market.
A Generative Middleware Specialization Process for Distributed Real-Time and Embedded Systems
A Generative Middleware Specialization Process for Distributed Real-time and Embedded Systems. Akshay Dabholkar and Aniruddha Gokhale, Dept. of EECS, Vanderbilt University, Nashville, TN 37235, USA. Email: {aky,gokhale}@dre.vanderbilt.edu
Abstract: General-purpose middleware must often be specialized for resource-constrained, real-time and embedded systems to improve their response times, reliability, memory footprint, and even power consumption. Software engineering techniques, such as aspect-oriented programming (AOP), feature-oriented programming (FOP), and reflection, make the specialization task simpler, albeit still requiring the system developer to manually identify the system invariants and the sources of performance and memory footprint bottlenecks that determine the required specializations. Specialization reuse is also hampered by the lack of a common taxonomy to document the recurring specializations. This paper presents the GeMS (Generative Middleware Specialization) framework to address these challenges. We present results of applying GeMS to a Distributed Real-time and Embedded (DRE) system case study that depict a 21-35% reduction in footprint and a ~36% improvement in performance, while simultaneously alleviating ~97% of the developer efforts in specializing middleware.
... domain and product variant, a process we call middleware specialization. Most prior efforts at specializing middleware (and other system artifacts) [1]-[6] often require manual efforts in identifying opportunities for specialization and realizing them on the software artifacts. At first glance it may appear that these manual efforts are expended towards addressing problems that are purely accidental in nature. A close scrutiny, however, reveals that system developers face a number of inherent complexities as well, which stem from the following reasons: 1. Spatial disparity between OO-based middleware design and domain-level concerns: Middleware is traditionally designed using object-oriented (OO) principles, which enforce ...
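The abstract mentions aspect-oriented and feature-oriented programming as aids to specialization. The toy Python sketch below (not from the GeMS paper; the feature flag and function names are invented) shows the basic flavor: a cross-cutting concern is woven around a function only when its feature is enabled, so a specialized configuration pays nothing for a concern it does not use:

```python
import functools

ENABLE_TRACING = False  # the specialization decision, e.g. set per product variant

def traced(func):
    """Toy aspect: weave call tracing around `func` only when enabled;
    otherwise return the function untouched, so the specialized build
    carries no overhead for the unused concern."""
    if not ENABLE_TRACING:
        return func

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"call {func.__name__}{args}")
        return func(*args, **kwargs)

    return wrapper

@traced
def dispatch(event_id: int) -> str:
    # Stand-in for a middleware event-dispatch path.
    return f"handled event {event_id}"

print(dispatch(42))
```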
What to Do Till the Computer Scientist Comes
What to Do Till the Computer Scientist Comes, by G. E. Forsythe. Reprinted from the American Mathematical Monthly, Vol. 75, No. 5, May 1968. George E. Forsythe, Computer Science Department, Stanford University.
Computer science departments. What is computer science anyway? This is a favorite topic in computer science department meetings. Just as with definitions of mathematics, there is less than total agreement and, moreover, you must know a good deal about the subject before any definition makes sense. Perhaps the tersest answer is given by Newell, Perlis, and Simon [8]: just as zoology is the study of animals, so computer science is the study of computers. They explain that it includes the hardware, the software, and the useful algorithms computers perform. I believe they would also include the study of computers that might be built, given sufficient demand and sufficient development in the technology. In an earlier paper [4], the author defines computer science as the art and science of representing and processing information. Some persons [10] extend the subject to include a study of the structure of information in nature (e.g., the genetic code).
Computer scientists work in three distinguishable areas: (1) design of hardware components and especially total systems; (2) design of basic languages and software broadly useful in applications, including monitors, compilers, time-sharing systems, etc.; (3) methodology of problem solving with computers. The accent here is on the principles of problem solving: those techniques that are common to solving broad classes of problems, as opposed to the preparation of individual programs to solve single problems.