Hacking Interfaces: How to Control a Computer Using Your Mind
Total Pages: 16
File Type: PDF; Size: 1020 KB
Recommended publications
-
SESAR JU Consolidated Annual Activity Report 2020
SESAR JU Consolidated Annual Activity Report 2020
Abstract: This Consolidated Annual Activity Report, established in accordance with the guidelines set forth in Communication from the Commission ref. 2020/2297, provides comprehensive information on the implementation of the agency work programme, budget, staff policy plan, and management and internal control systems in 2020.
© SESAR Joint Undertaking, 2021. Reproduction of text is authorised, provided the source is acknowledged. For any use or reproduction of photos, illustrations or artworks, permission must be sought directly from the copyright holders.
-
ClangJIT: Enhancing C++ with Just-in-Time Compilation
ClangJIT: Enhancing C++ with Just-in-Time Compilation
Hal Finkel, Lead, Compiler Technology and Programming Languages, Leadership Computing Facility, Argonne National Laboratory, Lemont, IL, USA, [email protected]
David Poliakoff, Lawrence Livermore National Laboratory, Livermore, CA, USA, [email protected]
David F. Richards, Lawrence Livermore National Laboratory, Livermore, CA, USA, [email protected]

ABSTRACT
The C++ programming language is not only a keystone of the high-performance-computing ecosystem but has proven to be a successful base for portable parallel-programming frameworks. As is well known, C++ programmers use templates to specialize algorithms, thus allowing the compiler to generate highly-efficient code for specific parameters, data structures, and so on. This capability has been limited to those specializations that can be identified when the application is compiled, and in many critical cases, compiling all potentially-relevant specializations is not practical. ClangJIT provides a well-integrated C++ language extension allowing template-based specialization to occur during program execution.

… body of C++ code, but critically, defer the generation and optimization of template specializations until runtime using a relatively-natural extension to the core C++ programming language. A significant design requirement for ClangJIT is that the runtime-compilation process not explicitly access the file system - only loading data from the running binary is permitted - which allows for deployment within environments where file-system access is either unavailable or prohibitively expensive. In addition, this requirement maintains the redistributability of the binaries using the JIT-compilation features (i.e., they can run on systems where the source code is unavailable). For example, on large HPC deployments, especially on supercomputers with distributed file systems, …
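To make the deferred-specialization idea concrete, the sketch below contrasts ordinary compile-time template instantiation with the runtime-deferred instantiation the abstract describes. The [[clang::jit]] attribute spelling and the ability to pass runtime values as template arguments are my reading of the paper and should be checked against it; the commented portion requires the ClangJIT-modified toolchain, not a stock compiler.

```cpp
#include <cstdio>

// Standard C++: every specialization of kernel_static<N> that the program uses
// must be known when the application is compiled.
template <int N>
void kernel_static(double *data) {
    for (int i = 0; i < N; ++i)
        data[i] *= 2.0;
}

// ClangJIT-style extension (hedged sketch, not standard C++): marking the
// template with [[clang::jit]] defers instantiation to runtime, so the
// specialization parameter can come from input read at runtime.
//
//   template <int N>
//   [[clang::jit]] void kernel_jit(double *data);
//
//   int n = read_unroll_factor_from_config();  // known only at runtime
//   kernel_jit<n>(data);                       // specialized and optimized now
//
// read_unroll_factor_from_config() is a hypothetical helper for illustration.

int main() {
    double data[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    kernel_static<8>(data);  // instantiated at compile time
    std::printf("%f\n", data[0]);
    return 0;
}
```
-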
The Mother of All Demos
UC Irvine: Embodiment and Performativity
Title: The Mother of All Demos
Permalink: https://escholarship.org/uc/item/91v563kh
Author: Salamanca, Claudia
Publication Date: 2009-12-12
Peer reviewed
eScholarship.org, powered by the California Digital Library, University of California

The Mother of All Demos
Claudia Salamanca, PhD Student, Rhetoric Department, University of California Berkeley, 1929 Fairview St. Apt B, Berkeley, CA 94703, 1 510 735 1061, [email protected]

ABSTRACT
This paper analyses the documentation of the special session delivered by Douglas Engelbart and William English on December 9, 1968 at the Fall Joint Computer Conference in San Francisco.

… guide situated at the mission control and from there he takes us into another location: a location that Levy calls the final frontier. This description offered by Levy, as well as the performance in itself, shows a movement in time and space. The name, "The Mother of All Demos," refers to a temporality under which all previous demos are subcategories of this performance. Furthermore, the name also points to a futurality that is constantly in production: all future demos are also included. What was delivered on December 9, 1968 captured the past but also our future. In order to explain this extended temporality, Engelbart's demo needs to be addressed not only from the perspective of the technological breakthroughs but also the modes in which they were delivered. This mode of futurality goes beyond the future simple tense continuously invoked by rhetorics of progress and technology. The purpose of this paper is to interrogate "The Mother of All Demos" as a performance, inquiring into what this session made and is still making possible.

Categories and Subject Descriptors: A.0 [Conference Proceedings]
General Terms: Documentation, Performance, Theory.
Keywords: Demo, medium performance, fragmentation, technology, augmentation system, condensation, space, body, mirror, …
-
NeuFuzz: Efficient Fuzzing with Deep Neural Network
Received January 15, 2019, accepted February 6, 2019, date of current version April 2, 2019. Digital Object Identifier 10.1109/ACCESS.2019.2903291

NeuFuzz: Efficient Fuzzing With Deep Neural Network
Yunchao Wang, Zehui Wu, Qiang Wei, and Qingxian Wang
China National Digital Switching System Engineering and Technological Research Center, Zhengzhou 450000, China
Corresponding author: Qiang Wei ([email protected])
This work was supported by the National Key R&D Program of China under Grant 2017YFB0802901.

ABSTRACT Coverage-guided graybox fuzzing is one of the most popular and effective techniques for discovering vulnerabilities due to its nature of high speed and scalability. However, the existing techniques generally focus on code coverage but not on vulnerable code. These techniques aim to cover as many paths as possible rather than to explore paths that are more likely to be vulnerable. When selecting the seeds to test, the existing fuzzers usually treat all seed inputs equally, ignoring the fact that paths exercised by different seed inputs are not equally vulnerable. This results in wasting time testing uninteresting paths rather than vulnerable paths, thus reducing the efficiency of vulnerability detection. In this paper, we present a solution, NeuFuzz, using the deep neural network to guide intelligent seed selection during graybox fuzzing to alleviate the aforementioned limitation. In particular, the deep neural network is used to learn the hidden vulnerability pattern from a large number of vulnerable and clean program paths to train a prediction model to classify whether paths are vulnerable. The fuzzer then prioritizes seed inputs that are capable of covering the likely to be vulnerable paths and assigns more mutation energy (i.e., the number of inputs to be generated) to these seeds.
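The scheduling idea in the abstract (score each seed's path with a trained model, test the highest-scoring seeds first, and give them more mutation energy) can be sketched in a few lines. This is a conceptual illustration, not the authors' implementation; the prediction function stands in for the deep neural network and uses a toy heuristic so the example runs.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

// Each seed exercises one program path; the model predicts how likely that
// path is to contain a vulnerability, and seeds on likely-vulnerable paths
// receive more mutation energy (more generated inputs).
struct Seed {
    std::vector<uint8_t> input;    // test-case bytes
    std::vector<uint32_t> path;    // IDs of branches/blocks the seed covered
    double vuln_score = 0.0;       // model-predicted vulnerability likelihood
};

// Stand-in for DNN inference over path features (toy heuristic only).
double predict_vulnerability(const std::vector<uint32_t>& path) {
    return path.size() >= 3 ? 0.9 : 0.2;
}

// Mutation energy grows with the predicted score, within fixed bounds.
int mutation_energy(const Seed& s, int base_energy = 32, int max_energy = 1024) {
    int energy = static_cast<int>(base_energy + s.vuln_score * (max_energy - base_energy));
    return std::min(std::max(energy, base_energy), max_energy);
}

// Score every seed and put the likely-vulnerable ones at the front of the queue.
void schedule(std::vector<Seed>& queue) {
    for (Seed& s : queue)
        s.vuln_score = predict_vulnerability(s.path);
    std::sort(queue.begin(), queue.end(),
              [](const Seed& a, const Seed& b) { return a.vuln_score > b.vuln_score; });
}

int main() {
    std::vector<Seed> queue = {
        {{0x41, 0x42}, {1, 2, 3}, 0.0},
        {{0x43}, {4, 5}, 0.0},
    };
    schedule(queue);
    for (const Seed& s : queue)
        std::printf("score=%.2f energy=%d\n", s.vuln_score, mutation_energy(s));
    return 0;
}
```
-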
Communicative Capital for Prosthetic Agents
This is an unpublished technical report undergoing peer review, not a final typeset article. First draft: July 23, 2016. Current draft: November 9, 2017.

Communicative Capital for Prosthetic Agents
Patrick M. Pilarski (1,2,*), Richard S. Sutton (2), Kory W. Mathewson (1,2), Craig Sherstan (1,2), Adam S. R. Parker (1,2), and Ann L. Edwards (1,2)
1 Division of Physical Medicine and Rehabilitation, Department of Medicine, University of Alberta, Edmonton, AB, Canada
2 Reinforcement Learning and Artificial Intelligence Laboratory, Department of Computing Science, University of Alberta, Edmonton, AB, Canada
Correspondence*: Patrick M. Pilarski, Division of Physical Medicine and Rehabilitation, Department of Medicine, 5-005 Katz Group Centre for Pharmacy and Health Research, University of Alberta, Edmonton, AB, Canada, T6G 2E1. [email protected]

ABSTRACT
This work presents an overarching perspective on the role that machine intelligence can play in enhancing human abilities, especially those that have been diminished due to injury or illness. As a primary contribution, we develop the hypothesis that assistive devices, and specifically artificial arms and hands, can and should be viewed as agents in order for us to most effectively improve their collaboration with their human users. We believe that increased agency will enable more powerful interactions between human users and next-generation prosthetic devices, especially when the sensorimotor space of the prosthetic technology greatly exceeds the conventional control and communication channels available to a prosthetic user. To more concretely examine an agency-based view on prosthetic devices, we propose a new schema for interpreting the capacity of a human-machine collaboration as a function of both the human's and machine's degrees of agency.
-
Radical Atoms: Beyond Tangible Bits, Toward Transformable Materials Cover Story by Hiroshi Ishii, Dávid Lakatos, Leonardo Bonanni, and Jean-Baptiste Labrune
Volume XIX.1 | January + February 2012
Radical Atoms: Beyond Tangible Bits, Toward Transformable Materials
Cover Story by Hiroshi Ishii, Dávid Lakatos, Leonardo Bonanni, and Jean-Baptiste Labrune
Association for Computing Machinery

Hiroshi Ishii, MIT Media Lab | [email protected]
Dávid Lakatos, MIT Media Lab | [email protected]
Leonardo Bonanni, MIT Media Lab | [email protected]
Jean-Baptiste Labrune, MIT Media Lab | [email protected]

Graphical user interfaces (GUIs) let users see digital information only through a screen, as if looking into a pool of water, as depicted in Figure 1 on page 40. We interact with the forms below through remote controls, such as a mouse, a keyboard, or a touchscreen (Figure 1a). Now imagine an iceberg, a mass of ice that penetrates the surface of the water and provides a handle for the mass beneath. This metaphor describes tangible user interfaces: They act as physical manifestations of computation, allowing us to interact directly with the portion that is made tangible—the "tip of the iceberg."

… appearance dynamically, so they are as reconfigurable as pixels on a screen. Radical Atoms is a vision for the future of human-material interactions, in which all digital information has physical manifestation so that we can interact directly with it—as if the iceberg had risen from the depths to reveal its sunken mass (Figure 1c).

From GUI to TUI
Humans have evolved a heightened ability to sense and manipulate the physical world, yet the digital world takes little advantage of our capacity for hand-eye coordination.
-
Master's Thesis
Faculty of Science and Technology
Master's Thesis
Study programme/specialisation: Computer Science
Spring/Autumn semester, 2019
Open/Confidential
Author: Nicolas Fløysvik (signature of author)
Programme coordinator: Hein Meling
Supervisor(s): Hein Meling
Title of master's thesis: Using domain restricted types to improve code correctness
Credits: 30
Keywords: Domain restrictions, Formal specifications, Symbolic execution, Roslyn analyzer
Number of pages: 75 (+ supplemental material/other: 0)
Stavanger, 15/06/2019

Domain Restricted Types for Improved Code Correctness
Nicolas Fløysvik, University of Stavanger
Supervised by: Professor Hein Meling, University of Stavanger
June 2019

Abstract
ReDi is a new static analysis tool for improving code correctness. It targets the C# language and is a .NET Roslyn live analyzer, providing live analysis feedback to the developers using it. ReDi uses principles from formal specification and symbolic execution to implement methods for performing domain restriction on variables, parameters, and return values. A domain restriction is an invariant implemented as a check function that can be applied to variables through an annotation referring to the check method. ReDi can also help to prevent runtime exceptions caused by null pointers. ReDi can prevent null exceptions by integrating nullability into the domain of the variables, making it feasible for ReDi to statically keep track of null and to detect variables that may be null when used. ReDi shows promising results in finding inconsistencies and faults in several programming projects, including the open source CoreWiki project by Jeff Fritz and several web service API projects for services offered by Innovation Norway.
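ReDi itself is a C#/Roslyn analyzer that checks annotated restrictions statically at development time. As a language-neutral illustration of the underlying idea only (a domain restriction expressed as a check function that guards a type's values), here is a small sketch; the names and the runtime enforcement are mine, not ReDi's API.

```cpp
#include <iostream>
#include <stdexcept>

// A "domain restricted type": the wrapper admits only values for which the
// check function holds, so any code receiving a Percentage may rely on the
// 0..100 invariant without re-validating it.
class Percentage {
public:
    explicit Percentage(double value) : value_(check(value)) {}
    double get() const { return value_; }

private:
    // The check function encodes the domain restriction (the invariant).
    static double check(double v) {
        if (v < 0.0 || v > 100.0)
            throw std::invalid_argument("Percentage must lie in [0, 100]");
        return v;
    }
    double value_;
};

// Callers can no longer pass an arbitrary, unchecked double by accident.
double apply_discount(double price, Percentage discount) {
    return price * (1.0 - discount.get() / 100.0);
}

int main() {
    std::cout << apply_discount(250.0, Percentage(20.0)) << "\n";  // prints 200
    // Percentage(120.0) would throw: the value lies outside the declared domain.
    return 0;
}
```

A static analyzer in the spirit of ReDi moves this enforcement from runtime to analysis time, flagging assignments and calls that cannot be shown to satisfy the declared restriction.
-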
The People Who Invented the Internet Source: Wikipedia's History of the Internet
The People Who Invented the Internet
Source: Wikipedia's History of the Internet

Contents: History of the Internet; Barry Appelman; Paul Baran; Vint Cerf; Danny Cohen (engineer); David D. Clark; Steve Crocker; Donald Davies; Douglas Engelbart; Charles M. Herzfeld; Internet Engineering Task Force; Bob Kahn; Peter T. Kirstein; Leonard Kleinrock; John Klensin; J. C. R. Licklider; Jon Postel; Louis Pouzin; Lawrence Roberts (scientist); John Romkey; Ivan Sutherland; Robert Taylor (computer scientist); Ray Tomlinson; Oleg Vishnepolsky; Phil Zimmermann; References; Article Sources and Contributors; Image Sources, Licenses and Contributors; Article Licenses.

History of the Internet
The history of the Internet began with the development of electronic computers in the 1950s. This began with point-to-point communication between mainframe computers and terminals, expanded to point-to-point connections between computers, and then early research into packet switching. Packet-switched networks such as ARPANET, the Mark I at NPL in the UK, CYCLADES, Merit Network, Tymnet, and Telenet were developed in the late 1960s and early 1970s using a variety of protocols. The ARPANET in particular led to the development of protocols for internetworking, where multiple separate networks could be joined together into a network of networks. In 1982 the Internet Protocol Suite (TCP/IP) was standardized, and the concept of a world-wide network of fully interconnected TCP/IP networks called the Internet was introduced.
-
Die Multiple Identität Der Technik
Kirstin Lenzen, Die multiple Identität der Technik [The Multiple Identity of Technology] | Volume 9

Editorial: Modern societies can only be understood if technology and the body are included conceptually. Only in these materialities do actions have a fixed place, and only there do social practices and interactions gain duration and extension. Conversely, describing technologies and bodies without societal practices, be they those of experimental production, instrumental action, or playful engagement, would mean renouncing the social-theoretical heritage from Marx to Plessner and from Mead to Foucault, and losing critical distance from strategies of control and structures of power. The bioscientific technicization of the body and the computer, nano, and network revolutions of the technical now bring these two material dimensions of the social so closely together that body and technology become analyzable as "socio-organic-technical" hybrid constellations. This, however, also makes the question of modern society more complicated: the boundaries of the social cut across the triad of human, animal, and machine and must be surveyed anew. The series Technik | Körper | Gesellschaft (Technology | Body | Society) presents studies that approach this question of the new boundary-drawings and interactional webs of the social. They take as their subject technological change and the effects of hybrid constellations, the processes of innovation, and the staging of the relations between technology and society and/or body and society, and they consistently think social practices and the materialities of technologies and bodies together. The series is edited by Gesa Lindemann and Werner Rammert.

Kirstin Lenzen (Dr.), born 1972, conducted research as an ergonomics researcher at the Chair and Institute of Industrial Engineering and Ergonomics (IAW) at RWTH Aachen and as a sociologist of technology at the Institutes of Sociology of RWTH Aachen and TU Berlin, where she completed her doctorate under Werner Rammert.
-
Turing Award Elites Revisited: Patterns of Productivity, Collaboration, Authorship and Impact
arXiv:2106.11534v1 [cs.DL] 22 Jun 2021

Turing Award elites revisited: patterns of productivity, collaboration, authorship and impact
Yinyu Jin (1), Sha Yuan (1,*), Zhou Shao (2,4), Wendy Hall (3), Jie Tang (4)
2 Nanjing University of Science and Technology, Nanjing, China; 3 University of Southampton, Southampton, U.K.
Received: date / Accepted: date

Abstract: The Turing Award is recognized as the most influential and prestigious award in the field of computer science (CS). With the rise of the science of science (SciSci), a large amount of bibliographic data has been analyzed in an attempt to understand the hidden mechanisms of scientific evolution, including analyses of the Nobel Prize in physics, chemistry, medicine, and other fields. In this article, we extract and analyze the data of 72 Turing Award laureates from the complete bibliographic data, filling the gap left by the lack of Turing Award analysis and uncovering the development characteristics of computer science as an independent discipline. First, we show that most Turing Award laureates have long-term, high-quality educational backgrounds, and more than 61% of them hold a degree in mathematics, which indicates that mathematics has played a significant role in the development of computer science. Second, the data show that not all laureates have high productivity or a high h-index; that is, the number of publications and the h-index are not the leading indicators for the Turing Award. Third, the average age of awardees has increased from 40 to around 70 in recent years. This may be because new breakthroughs take longer, and some new technologies need time to prove their influence. In addition, we find that in the past ten years, international collaboration has experienced explosive growth, showing a new paradigm in the form of collaboration.
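Since the abstract repeatedly weighs publication counts and the h-index as (imperfect) indicators of a laureate's impact, it may help to recall how the h-index is computed. The definition below is the standard one, not anything specific to this paper's dataset; the example numbers are made up.

```cpp
#include <algorithm>
#include <functional>
#include <iostream>
#include <vector>

// Standard h-index: the largest h such that the author has at least
// h papers with at least h citations each.
int h_index(std::vector<int> citations) {
    std::sort(citations.begin(), citations.end(), std::greater<int>());
    int h = 0;
    while (h < static_cast<int>(citations.size()) && citations[h] >= h + 1)
        ++h;
    return h;
}

int main() {
    // Five papers cited 10, 8, 5, 3, and 1 times give h = 3:
    // three papers have at least 3 citations, but not four with at least 4.
    std::cout << h_index({10, 8, 5, 3, 1}) << "\n";
    return 0;
}
```
-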
Validating Software Via Abstract State Specifications Technical Report
Validating Software via Abstract State Specifications
Jonathan S. Ostroff
Technical Report EECS-2017-02, July 31, 2017
Department of Electrical Engineering and Computer Science, 4700 Keele Street, Toronto, Ontario M3J 1P3, Canada

Abstract
We describe two tools, ETF and Mathmodels, for developing reliable software by eliciting precise specifications, validating them, and verifying that the final software product satisfies the requirements. Mathmodels extends the classical Eiffel contracting notation with the use of mathematical models (sets, sequences, relations, functions, bags) to describe abstract state machines. Classical contracts are incomplete or are low-level implementation assertions. Mathmodel contracts provide complete specifications of components and systems that can be verified via runtime contract checking, scaling up to large systems. Mathmodels are void safe and have immutable queries (for specifications) as well as relatively efficient mutable commands for the abstract description of algorithms. The ETF tool is used in requirements elicitation to derive specifications, to describe the user interface, to identify the abstract state, and to develop use cases before the software product is constructed. The ETF tool generates code that decouples the user interface from the design (the business logic). The ETF tool supports the derivation of important system safety invariants, which become Mathmodel class invariants in the production code. The ideas can be extended to other contracting languages and frameworks and are placed in the context of best practices for software engineering. We also discuss this work in the light of proposals for software engineering education.
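Mathmodels itself extends Eiffel contracts; as a minimal, language-neutral sketch of the idea (an abstract mathematical model of the state, with contracts checked against it at runtime), the C++ fragment below keeps a set-valued model alongside a concrete representation and asserts the postcondition and class invariant in terms of the model. The class and its operations are invented for illustration and are not from the report.

```cpp
#include <cassert>
#include <set>
#include <vector>

// Concrete state: a vector of member IDs. Abstract state: the set of IDs.
// Contracts are stated over the abstract model and checked at runtime.
class Roster {
public:
    void add(int id) {
        std::set<int> old_model = model();   // snapshot for the postcondition
        if (!contains(id))
            members_.push_back(id);
        std::set<int> expected = old_model;  // postcondition: model = old model U {id}
        expected.insert(id);
        assert(model() == expected);
        assert(invariant());
    }

    bool contains(int id) const {
        for (int m : members_) if (m == id) return true;
        return false;
    }

private:
    // Abstraction function: maps the concrete representation to its model.
    std::set<int> model() const { return std::set<int>(members_.begin(), members_.end()); }

    // Class invariant stated over the model: no duplicate members.
    bool invariant() const { return model().size() == members_.size(); }

    std::vector<int> members_;
};

int main() {
    Roster r;
    r.add(7);
    r.add(7);   // second add is a no-op; the contracts still hold
    assert(r.contains(7));
    return 0;
}
```
-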
Support for Algebraic Data Types in Viper Bachelor of Science Thesis Project Description
Support for Algebraic Data Types in Viper
Bachelor of Science Thesis Project Description
David Rohr
Supervised by Arshavir Ter-Gabrielyan
Prof. Dr. Peter Müller
Chair of Programming Methodology, Department of Computer Science, ETH Zürich
September 30, 2016

1 Introduction
Viper [1] is a suite of verification tools developed at ETH Zurich. A central part of Viper is its intermediate verification language, which can be used to describe, for instance, object-oriented programs and their properties (in the form of preconditions, postconditions and assertions) as well as complex data structures like trees and other types of graphs. These data structures can be specified in Viper using recursive predicates [1, pp. 9f.], quantified permissions [1, pp. 12f.] or, potentially, algebraic data types (sometimes referred to as ADTs, but not to be confused with abstract data types), which are commonly used in several functional and multi-paradigm programming languages like Haskell [2], F# [3], Scala [4] or Rust [5]. Although it is not possible to directly reference or dereference their instances (because they are value types), their simplicity and the fact that operations on algebraic data types don't cause any side effects - you could also call them pure - make them a useful tool for the specification of data structures. However, Viper does currently not support the direct (native) declaration of algebraic data types. Instead, they need to be encoded in Viper using custom domains [1, pp. 16f.], which is, depending on the encoded data type, considerably more complicated and error-prone than, for example, a native ADT declaration in Haskell. The main goal of this project is to design and implement a language extension that facilitates the definition and usage of algebraic data types in Viper.
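For readers unfamiliar with algebraic data types, the fragment below shows (in C++, not in Viper or Haskell) the kind of declaration and pure, case-analysis-style operation that a native ADT gives the specification writer; encoding the same thing through Viper domains means writing constructors, discriminators, and axioms by hand, which is the overhead this project aims to remove. The cons-list here is a generic textbook example, not taken from the project description.

```cpp
#include <iostream>
#include <memory>
#include <variant>

// An ADT with two constructors: Nil (empty list) and Cons (head plus tail).
struct Nil {};
struct Cons;
using List = std::variant<Nil, std::shared_ptr<Cons>>;
struct Cons {
    int head;
    List tail;
};

// A pure, side-effect-free function defined by case analysis on the
// constructors, the style of operation ADTs support in specifications.
int sum(const List& xs) {
    if (std::holds_alternative<Nil>(xs))
        return 0;
    const Cons& cell = *std::get<std::shared_ptr<Cons>>(xs);
    return cell.head + sum(cell.tail);
}

int main() {
    List xs = std::make_shared<Cons>(Cons{1, std::make_shared<Cons>(Cons{2, Nil{}})});
    std::cout << sum(xs) << "\n";  // prints 3
    return 0;
}
```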