Advancing Deductive Program-Level Verification for Real-World Application: Lessons Learned from an Industrial Case Study

Total pages: 16

File type: PDF, size: 1020 KB

Advancing Deductive Program-Level Verification for Real-World Application: Lessons Learned from an Industrial Case Study

Dissertation approved by the Faculty of Informatics of the Karlsruhe Institute of Technology (KIT) for the academic degree of Doktor der Naturwissenschaften (Doctor of Natural Sciences), submitted by Thorsten Bormer from Koblenz. Date of the oral examination: 23 October 2014. Referee: Prof. Dr. Bernhard Beckert. Co-referees: Dr. Claude Marché and Assoc. Prof. Dr. Wolfgang Ahrendt.

Contents

Zusammenfassung (German summary)
1 Introduction
  1.1 Formal Methods in Practice: the Verisoft Projects
  1.2 Current State of Deductive Verification
  1.3 Contents and Structure of the Thesis
    1.3.1 Part I – Verifying a Paravirtualizing Hypervisor
    1.3.2 Part II – Improving the Verification Process
  1.4 Contributions
  1.5 Previously Published Material

I Deductive Verification of an Industrial Microkernel
2 Preliminaries
  2.1 System Virtualization
    2.1.1 Operating Systems
    2.1.2 Full System Virtualization
    2.1.3 Paravirtualization
  2.2 PowerPC
    2.2.1 Memory Management
    2.2.2 Interrupts
  2.3 PikeOS – An Industrial Microkernel for System Virtualization
  2.4 Deductive Verification
    2.4.1 The VCC System
    2.4.2 The KeY System
3 Verifying a Paravirtualizing Microkernel
  3.1 Correctness of Virtualization Kernels as Simulation Property
    3.1.1 Simulations
    3.1.2 The PikeOS Simulation Theorem
  3.2 Simulation Proofs with VCC
    3.2.1 Sequential Systems
    3.2.2 Verifying PikeOS System Calls – Sequential Execution
    3.2.3 Concurrency
  3.3 Ingredients for Pervasive Correctness
  3.4 Related Work
    3.4.1 The Hypervisor Verification Project of Verisoft XT
    3.4.2 The L4.verified Project
4 Lessons Learned from PikeOS Verification
  4.1 Formalizing Requirements
  4.2 Adding Auxiliary Annotations
    4.2.1 Modularization
    4.2.2 Abstraction
  4.3 Local Verification
  4.4 Handling Software Evolution

II Improving Deductive Verification for Real-World Application
5 The Auto-Active Verification Paradigm
  5.1 Introduction
    5.1.1 The Possible Outcomes of Invoking an Annotation-based Verification Tool
  5.2 Distinguishing Different Kinds of Annotations
    5.2.1 Annotations and their Properties
    5.2.2 Annotations and Existence of Proofs
    5.2.3 Possible Failures in Authoring Annotations
    5.2.4 Improving the Annotation Languages and Methodologies
6 Improving Trust in Verification Systems
  6.1 Targets of Evaluation
  6.2 Test Cases for Program Verification Systems
  6.3 Testing Different Properties
  6.4 Axiomatization Coverage
    6.4.1 Completeness Coverage
    6.4.2 Soundness Coverage
  6.5 Case Studies
    6.5.1 Testing the Axiomatization of VCC
    6.5.2 Testing the Calculus Rules of KeY
  6.6 Improving Performance of Axiomatization Coverage Computation
  6.7 Improving Completeness Coverage of Existing Test Suites
  6.8 Related Work
  6.9 Conclusions and Future Work
7 Improving Feedback for Verification
  7.1 Preliminaries
    7.1.1 Verification Targets
    7.1.2 Annotations and Their Semantics
    7.1.3 The Verification Task
    7.1.4 The Modular Verification Process
    7.1.5 Top-down and Bottom-up Verification
    7.1.6 Bounded Software Verification
    7.1.7 The Low-Level Bounded Model Checker (LLBMC)
  7.2 Deductive Verification of Large Software Systems
    7.2.1 Object Orientation
  7.3 The Integrated Verification Process
  7.4 A Typical Specification Scenario
    7.4.1 The Program to be Verified
    7.4.2 Local Verification
    7.4.3 Global Verification
  7.5 Evaluation
    7.5.1 Checking Program Correctness with LLBMC
    7.5.2 Improving Performance of Specification Checking
  7.6 Related Work
  7.7 Conclusion and Future Work
8 Specification Using Abstract Data Types
  8.1 Specifying Operations on Abstract Data Types – A Simple Case Study
    8.1.1 The VCC Approach
    8.1.2 The KeY Approach
  8.2 Separation of Concerns: Annotation-based Verification and Algebraic Specifications
  8.3 Related and Future Work
  8.4 Conclusion
9 Conclusions

List of Figures and Tables

2 Preliminaries
  2.1 Operating system privilege separation
  2.2 Typical paravirtualization and full virtualization system setups
  2.3 A typical PikeOS virtualization setup featuring isolated guest systems
  2.4 The VCC toolchain architecture
  2.5 VCC integration into the Visual Studio IDE and Model Viewer
  2.8 Graphical user interface of the KeY verification system
3 Verifying a Paravirtualizing Microkernel
  3.1 Overview of the specification state for PikeOS verification
  3.7 Overall PikeOS specification structure
6 Improving Trust in Verification Systems
  6.1 The different results of tests for auto-active verification systems and the failures they indicate
  6.3 Frequencies of runtimes for test cases of the KeY test suite
  6.4 Axiomatization coverage measures for the first VCC experiment
  6.5 Taclet coverage data for the KeY test suite by taclet and test case
  6.6 Groups of similar KeY test cases in terms of taclet coverage
  6.7 Taclet coverage counts for the KeY test suite
  6.8 Average test case selectivity by taclet for the KeY test suite
7 Improving Feedback for Verification
  7.1 Reasoning about functional correctness of a sample program
  7.2 Example for weakest precondition computation
  7.3 Comparison of regular VCC workflow with our CEGMAR process
  7.4 Excerpt from the requirement specification of copyNoDuplicates and its translation into LLBMC input
  7.11 Performance of exhaustive specification checks with LLBMC
  7.12 Performance results checking specifications for single problem instances
  7.13 Comparison of LLBMC performance for checking specifications using different optimization strategies

List of Algorithms and Listings

2 Preliminaries
  2.6 Example VCC function contract
  2.7 JavaDL proof obligation for method min
3 Verifying a Paravirtualizing Microkernel
  3.2 Implementation of the PikeOS system call to change thread priority
  3.3 Excerpt of abstract PikeOS model (C data structure definition)
  3.4 Annotations for verifying function to set thread priority
  3.5 Lock data structure specification
  3.6 Lock operation contracts
  3.8 Annotated PikeOS global kernel information data structure
  3.9 Annotated top-level specification structure
  3.10 Annotated implementation for verifying p4_runner_changeprio in the concurrent setting
5 The Auto-Active Verification Paradigm
  5.1 Annotated implementation computing the smallest element of an array
6 Improving Trust in Verification Systems
  6.2 Algorithm to compute completeness axiomatization coverage
  6.9 A Java test case for qualified class instance creation
  6.10 Algorithm to generate completeness regression tests
7 Improving Feedback for Verification
  7.5 Implementation of copyNoDuplicates and insert
  7.6 Partial method contract for function insert
  7.7 Method contract of function copyNoDuplicates
  7.8 Auxiliary annotations for function copyNoDuplicates
  7.9 Example for wrapper function to establish pre-state for checking method contracts with LLBMC
  7.10 Method contract of function copyNoDuplicates for LLBMC
8 Specification ...
Recommended publications
  • Experiments in Validating Formal Semantics for C
    Experiments in validating formal semantics for C. Sandrine Blazy, ENSIIE and INRIA Rocquencourt, [email protected].
    Abstract. This paper reports on the design of adequate on-machine formal semantics for a certified C compiler. This compiler is an optimizing compiler that targets critical embedded software. It is written and formally verified using the Coq proof assistant. The main structure of the compiler is very strongly conditioned by the choice of the languages of the compiler, and also by the kind of semantics of these languages.
    1 Introduction. C is still widely used in industry, especially for developing embedded software. The main reason is the control the C programmer has over all the resources that are required for program execution (e.g. memory layout, memory allocation) and that affect performance. C programs can therefore be very efficient, but the price to pay is a greater programming effort. For instance, using C pointer arithmetic may be required in order to compute the address of a memory cell. However, a consequence of this freedom given to the C programmer is the presence of run-time errors such as buffer overflows and dangling pointers. Such errors may be difficult to detect in programs, because of C type casts and, more generally, because the C type system is unsafe. Many static analysis tools attempt to detect such errors in order to ensure safety properties that may be ensured by more recent languages such as Java. For instance, CCured is a program transformation system that adds memory safety guarantees to C programs by verifying statically that memory errors cannot occur and by inserting run-time checks where static verification is insufficient [1].
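    Neither example below is from Blazy's paper; this is a minimal C sketch of the two classes of run-time error the excerpt names (buffer overflow, dangling pointer), the kind of defect CCured-style checks and verified semantics must account for.

```c
#include <stdlib.h>
#include <string.h>

/* Illustrative only: two classic C memory errors that the (unsafe)
 * static type system accepts and that analysis tools or run-time
 * checks must catch. */
int unsafe_examples(void)
{
    char buf[8];
    /* Buffer overflow: copies 11 bytes (including '\0') into an
     * 8-byte buffer; the C type checker has no objection. */
    strcpy(buf, "0123456789");

    int *p = malloc(sizeof *p);
    if (p == NULL)
        return -1;
    *p = 42;
    free(p);
    /* Dangling pointer: p still holds the freed address, and this
     * read is undefined behaviour. */
    return *p;
}
```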
  • Introduction to Model Checking and Temporal Logic¹
    Formal Verification, Lecture 1: Introduction to Model Checking and Temporal Logic¹. Jacques Fleuriot, [email protected]. ¹Acknowledgement: adapted from original material by Paul Jackson, including some additions by Bob Atkey.
    Formal verification (in a nutshell): create a formal model of some system of interest (hardware, a communication protocol, software, especially concurrent software); describe formally a specification that we desire the model to satisfy; check that the model satisfies the specification, either by theorem proving (usually interactive, but not necessarily) or by model checking.
    Introduction to model checking: specifications as formulas, programs as models; programs are abstracted as finite state machines; formulas are in temporal logic.
    1. For a fixed ϕ, is M ⊨ ϕ true for all M? This is validity of ϕ, which can be shown via proof in a theorem prover, e.g. Isabelle.
    2. For a fixed ϕ, is M ⊨ ϕ true for some M? This is satisfiability.
    3. ...
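    A minimal sketch (not from the lecture notes) of the idea that checking a safety property such as G p over a program abstracted as a finite state machine amounts to exploring its reachable states; the transition relation and labelling below are invented for illustration.

```c
#include <stdbool.h>
#include <stdio.h>

/* Explicit-state check of the LTL safety property G p ("p holds in
 * every reachable state") over a hand-coded finite state machine. */
enum { N_STATES = 4 };

/* successor relation: succ[s][t] == true iff there is an edge s -> t */
static const bool succ[N_STATES][N_STATES] = {
    /* 0 */ { false, true,  false, false },
    /* 1 */ { false, false, true,  true  },
    /* 2 */ { true,  false, false, false },
    /* 3 */ { false, false, false, true  },
};

/* atomic proposition p, given per state */
static const bool p[N_STATES] = { true, true, true, false };

static bool holds_globally(int init)
{
    bool visited[N_STATES] = { false };
    int stack[N_STATES], top = 0;
    stack[top++] = init;
    visited[init] = true;
    while (top > 0) {                    /* DFS over reachable states */
        int s = stack[--top];
        if (!p[s])
            return false;                /* counterexample state found */
        for (int t = 0; t < N_STATES; t++)
            if (succ[s][t] && !visited[t]) {
                visited[t] = true;
                stack[top++] = t;
            }
    }
    return true;
}

int main(void)
{
    printf("M |= G p : %s\n", holds_globally(0) ? "yes" : "no");
    return 0;
}
```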
  • Formal Modelling of Separation Kernels
    The University of York, Department of Computer Science. Submitted in part fulfilment for the degree of MSc in Software Engineering. Formal Modelling of Separation Kernels. Andrius Velykis, 18th September 2009¹, supervised by Dr Leo Freitas. Number of words = 45327, as counted by detex report.tex | wc -w. This report consists of 98 pages in total; this includes the body of the report (without blank pages) and Appendix A, but not Appendices B, C, D, E and F. ¹Updated transactional operation proofs, 21st September 2009.
    Abstract. A separation kernel is an architecture for secure applications, which benefits from the inherent security of distributed systems. Due to its small size and usage in high-integrity environments, it makes a good target for formal modelling and verification. This project presents results from the mechanisation and modelling of separation kernel components: a process table, a process queue and a scheduler. The results have been developed as part of the pilot project within the international Grand Challenge in Verified Software. This thesis covers the full development life-cycle from project initiation through design and evaluation to successful completion. Important findings about kernel properties, formal modelling and design decisions are discussed. The developed formal specification is fully verified and contributes to the pilot project aim of creating a formal kernel model and refining it down to implementation code. Other reusable artefacts, such as general lemmas and a new technique for ensuring transactional properties of operations, are defined. The results will be curated within the Verified Software Repository.
    Robertai. Ačiū. (To Roberta. Thank you.) Acknowledgements: I would like to thank Dr Leo Freitas for his supervision, encouragement and getting me hooked on formal methods.
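    The thesis models these components in a formal specification notation; purely as an illustration of the data involved, here is a hypothetical C-style sketch of a process table and process queue, with some of the invariants a transactional scheduler operation must preserve. All names are invented, not taken from the thesis.

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical sketch of separation-kernel state: process table,
 * process (ready) queue, and the scheduler's choice of running PID. */
enum { MAX_PROCS = 32 };

typedef enum { PS_FREE, PS_READY, PS_RUNNING, PS_BLOCKED } proc_state_t;

typedef struct {
    proc_state_t state;
    int          priority;
} pcb_t;

typedef struct {
    pcb_t  table[MAX_PROCS];   /* process table, indexed by PID        */
    int    ready[MAX_PROCS];   /* process queue: FIFO of READY PIDs    */
    size_t ready_len;
    int    running;            /* PID chosen by the scheduler, or -1   */
} kernel_state_t;

/* Checks two representative invariants: every queued PID is READY, and
 * the running PID (if any) is marked RUNNING.  A "transactional"
 * operation either re-establishes such invariants in its post-state or
 * leaves the pre-state unchanged. */
bool invariant(const kernel_state_t *k)
{
    for (size_t i = 0; i < k->ready_len; i++)
        if (k->table[k->ready[i]].state != PS_READY)
            return false;
    return k->running == -1 || k->table[k->running].state == PS_RUNNING;
}
```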
  • PROGRAM VERIFICATION Robert S. Boyer and J Strother Moore
    PROGRAM VERIFICATION. Robert S. Boyer and J Strother Moore. To appear in the Journal of Automated Reasoning. The research reported here was supported by National Science Foundation Grant MCS-8202943 and Office of Naval Research Contract N00014-81-K-0634. Institute for Computing Science and Computer Applications, The University of Texas at Austin, Austin, Texas 78712.
    Computer programs may be regarded as formal mathematical objects whose properties are subject to mathematical proof. Program verification is the use of formal, mathematical techniques to debug software and software specifications.
    1. Code Verification. How are the properties of computer programs proved? We discuss three approaches in this article: inductive invariants, functional semantics, and explicit semantics. Because the first approach has received by far the most attention, it has produced the most impressive results to date. However, the field is now moving away from the inductive invariant approach.
    1.1. Inductive Assertions. The so-called Floyd-Hoare inductive assertion method of program verification [25, 33] has its roots in the classic Goldstine and von Neumann reports [53] and handles the usual kind of programming language, of which FORTRAN is perhaps the best example. In this style of verification, the specifier "annotates" certain points in the program with mathematical assertions that are supposed to describe relations that hold between the program variables and the initial input values each time "control" reaches the annotated point. Among these assertions are some that characterize acceptable input and the desired output. By exploring all possible paths from one assertion to the next and analyzing the effects of intervening program statements it is possible to reduce the correctness of the program to the problem of proving certain derived formulas called verification conditions.
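    A small worked example of the inductive assertion method described above (our own sketch, not from the article): a summation loop annotated with a loop invariant, followed by the verification conditions a verifier would derive from the paths between assertions.

```c
/* Floyd-Hoare inductive assertion sketch (illustrative only).
 * Precondition: n >= 0.
 * Desired output assertion: s == a[0] + ... + a[n-1].                  */
int sum(const int *a, int n)
{
    int s = 0;
    int i = 0;
    /* Inductive assertion annotated at the loop head:
     *   0 <= i && i <= n && s == a[0] + ... + a[i-1]                   */
    while (i < n) {
        s = s + a[i];
        i = i + 1;
    }
    /* Output assertion: s == a[0] + ... + a[n-1]                       */
    return s;
}

/* Verification conditions derived from the paths between assertions:
 *  1. Entry:        n >= 0 && s == 0 && i == 0         ==> invariant
 *  2. Preservation: invariant && i < n                 ==> invariant
 *                   after s := s + a[i]; i := i + 1
 *  3. Exit:         invariant && !(i < n)              ==> output assertion */
```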
  • Formal Verification, Model Checking
    Formal Verification, Model Checking. Radek Pelánek.
    Motivation. Formal methods: examples of what can go wrong (first lecture); non-intuitiveness of concurrency (particularly with shared resources); mutual exclusion; adding puzzle.
    Formal Methods. "Formal Methods" refers to mathematically rigorous techniques and tools for the specification, design and verification of software and hardware systems.
    Formal Verification. Formal verification is the act of proving or disproving the correctness of a system with respect to a certain formal specification or property.
    Formal verification vs testing:
                          formal verification   testing
    finding bugs          medium                good
    proving correctness   good                  -
    cost                  high                  small
    Types of bugs:
                     likely        rare
    harmless         testing       not important
    catastrophic     testing, FV   FV
    Formal verification techniques: manual (a human tries to produce a proof of correctness); semi-automatic (theorem proving); automatic (an algorithm takes a model (program) and a property and decides whether the model satisfies the property). We focus on automatic techniques.
  • Formal Verification of Diagnosability Via Symbolic Model Checking
    Formal Verification of Diagnosability via Symbolic Model Checking. Alessandro Cimatti (ITC-irst, Povo, Trento, Italy), Charles Pecheur (RIACS/NASA Ames Research Center, Moffett Field, CA, U.S.A.), Roberto Cavada (ITC-irst, Povo, Trento, Italy). [email protected].
    Abstract. This paper addresses the formal verification of diagnosis systems. We tackle the problem of diagnosability: given a partially observable dynamic system, and a diagnosis system observing its evolution over time, we discuss how to verify (at design time) if the diagnosis system will be able to infer (at run-time) the required information on the hidden part of the dynamic state. We tackle the problem by looking for pairs of scenarios that are observationally indistinguishable, but lead to situations that are required to be distinguished. We reduce the problem to a model checking problem.
    ... observed system. We propose a new, practical approach to the verification of diagnosability, making the following contributions. First, we provide a formal characterization of the diagnosability problem, using the idea of context, that explicitly takes into account the run-time conditions under which it should be possible to acquire certain information. Second, we show that a diagnosability condition for a given plant is violated if and only if a critical pair can be found. A critical pair is a pair of executions that are indistinguishable (i.e. share the same inputs and outputs), but hide conditions that should be distinguished (for instance, to prevent simple failures to stay undetected and degenerate into catastrophic events). We define the coupled twin model of the plant, and ...
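    Restated in notation of our own choosing (a paraphrase of the excerpt, not the paper's exact formalization), the critical-pair characterization reads:

```latex
% Paraphrase; \sigma_1, \sigma_2 range over executions of the plant,
% obs(.) is the observable input/output trace, and c is the hidden
% condition the diagnoser must detect.
\text{diagnosability of } c \text{ fails} \iff
  \exists \sigma_1, \sigma_2 .\;
    \mathrm{obs}(\sigma_1) = \mathrm{obs}(\sigma_2)
    \;\wedge\; \sigma_1 \models c
    \;\wedge\; \sigma_2 \models \neg c
```

    The coupled twin model runs two copies of the plant constrained to agree on their observations, so the search for such a pair becomes a query that a standard model checker can answer.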
  • MILS Architectural Approach Supporting Trustworthiness of the IIoT Solutions
    MILS Architectural Approach Supporting Trustworthiness of the IIoT Solutions. An Industrial Internet Consortium Whitepaper. Rance J. DeLong (The Open Group); Ekaterina Rudina (Kaspersky).
    Contents (excerpt):
    1 Context and Overview
      1.1 Need for Trustworthy System Operation
      1.2 What is MILS today
      1.3 How MILS Addresses Safety
      1.4 How MILS Addresses Security
      1.5 How MILS Supports Reliability, Resilience, and Privacy
    2 MILS Concepts
      2.1 Centralized vs Distributed Security Architecture
        2.1.1 Domain Isolation
        2.1.2 Isolation and Information Flow Control
        2.1.3 Separation ...
  • Theorem Proving for Verification
    Theorem Proving for Verification. John Harrison, Intel Corporation. CAV 2008, Princeton, 9th July 2008.
    Formal verification: mathematically prove the correctness of a design with respect to a mathematical formal specification. Actual requirements ↑ Formal specification ↑ Design model ↑ Actual system.
    Essentials of formal verification. The basic steps in formal verification: formally model the system; formalize the specification; prove that the model satisfies the spec. But what formalism should be used?
    Some typical formalisms: propositional logic, a.k.a. Boolean algebra; temporal logic (CTL, LTL etc.); quantifier-free combinations of first-order arithmetic theories; full first-order logic; higher-order logic or first-order logic with arithmetic or set theory.
    Expressiveness vs. automation. There is usually a roughly inverse relationship: the more expressive the formalism, the less the 'proof' is amenable to automation. For the simplest formalisms, the proof can be so highly automated that we may not even think of it as 'theorem proving' at all. The most expressive formalisms have a decision problem that is not decidable, or even semidecidable.
    Logical syntax:
    English                       Formal
    false                         ⊥
    true                          ⊤
    not p                         ¬p
    p and q                       p ∧ q
    p or q                        p ∨ q
    p implies q                   p ⇒ q
    p iff q                       p ⇔ q
    for all x, p                  ∀x. p
    there exists x such that p    ∃x. p
    Propositional logic. Formulas built up from atomic propositions (Boolean variables) and the constants ⊥, ⊤, using the propositional connectives ¬, ∧, ∨, ⇒ and ⇔. No quantifiers or internal structure to the atomic propositions.
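    As a concrete rendering of "formulas built up from atomic propositions and constants using the propositional connectives" (a hypothetical sketch, not from Harrison's slides), here is a small C datatype for propositional formulas together with evaluation under a valuation.

```c
#include <stdbool.h>

/* Propositional formulas as a small AST, built from constants, atomic
 * propositions and the connectives ~, /\, \/, ==> and <=>. */
typedef enum { F_FALSE, F_TRUE, F_ATOM, F_NOT, F_AND, F_OR, F_IMP, F_IFF } fkind;

typedef struct formula {
    fkind kind;
    int atom;                      /* index of the Boolean variable (F_ATOM)   */
    const struct formula *l, *r;   /* subformulas (unary F_NOT uses l only)    */
} formula;

/* Evaluate f under a valuation v, where v[i] is the truth value of atom i. */
bool eval(const formula *f, const bool *v)
{
    switch (f->kind) {
    case F_FALSE: return false;
    case F_TRUE:  return true;
    case F_ATOM:  return v[f->atom];
    case F_NOT:   return !eval(f->l, v);
    case F_AND:   return eval(f->l, v) && eval(f->r, v);
    case F_OR:    return eval(f->l, v) || eval(f->r, v);
    case F_IMP:   return !eval(f->l, v) || eval(f->r, v);
    case F_IFF:   return eval(f->l, v) == eval(f->r, v);
    }
    return false;
}
```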
  • National Information Assurance Partnership
    National Information Assurance Partnership, Common Criteria Evaluation and Validation Scheme. Validation Report: Green Hills Software INTEGRITY-178B Separation Kernel. Report Number: CCEVS-VR-10119-2008. Dated: 01 September 2008. Version: 1.0. National Institute of Standards and Technology, Information Technology Laboratory, 100 Bureau Drive, Gaithersburg, MD 20899. National Security Agency, Information Assurance Directorate, 9800 Savage Road STE 6757, Fort George G. Meade, MD 20755-6757.
    Acknowledgements. Validation Team: Shaun Gilmore, Santosh Chokhani, Ken Elliott, Jerry Myers, Paul Bicknell. Common Criteria Testing Laboratory: SAIC, Inc., Columbia, Maryland.
    Table of Contents (excerpt): 1 Executive Summary; 1.1 Evaluation Details; 2 Identification; 3 Threats to Security; 4 Security Policy; 5 Assumptions; 5.1 Physical Assumptions; 5.2 Personnel Assumptions; 5.3 Connectivity Assumptions ...
  • Safety Standards and WCET Analysis Tools Daniel Kästner, Christian Ferdinand
    Safety Standards and WCET Analysis Tools. Daniel Kästner, Christian Ferdinand. Embedded Real Time Software and Systems (ERTS2012), Feb 2012, Toulouse, France. HAL Id: hal-02192406, https://hal.archives-ouvertes.fr/hal-02192406, submitted on 23 Jul 2019. HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.
    Safety Standards and WCET Analysis Tools. Daniel Kästner, Christian Ferdinand. AbsInt GmbH, Science Park 1, D-66123 Saarbrücken, Germany. http://www.absint.com
    Abstract. In automotive, railway, avionics, automation, and healthcare industries more and more functionality is implemented by embedded software. A failure of safety-critical software may cause high costs or even endanger human beings. Also for applications which are not highly safety-critical, a software failure may necessitate expensive updates. Therefore, state-of-the-art techniques for verifying software safety requirements have to be applied to make sure that an application is working properly. To do so lies in the responsibility of the system designers. Ensuring software safety is one of the goals of safety standards like DO-178B, DO-178C, IEC-61508, ISO-26262, or EN-50128. They all require to identify functional and non-functional hazards and to demonstrate that the software does not violate ...
  • QWIRE Practice: Formal Verification of Quantum Circuits in Coq
    QWIRE Practice: Formal Verification of Quantum Circuits in Coq. Robert Rand, Jennifer Paykin, Steve Zdancewic ([email protected]), University of Pennsylvania.
    We describe an embedding of the QWIRE quantum circuit language in the Coq proof assistant. This allows programmers to write quantum circuits using high-level abstractions and to prove properties of those circuits using Coq's theorem proving features. The implementation uses higher-order abstract syntax to represent variable binding and provides a type-checking algorithm for linear wire types, ensuring that quantum circuits are well-formed. We formalize a denotational semantics that interprets QWIRE circuits as superoperators on density matrices, and prove the correctness of some simple quantum programs.
    1 Introduction. The last few years have witnessed the emergence of lightweight, scalable, and expressive quantum circuit languages such as Quipper [10] and LIQUi|⟩ [22]. These languages adopt the QRAM model of quantum computation, in which a classical computer sends instructions to a quantum computer and receives back measurement results. Quipper and LIQUi|⟩ programs classically produce circuits that can be executed on a quantum computer, simulated on a classical computer, or compiled using classical techniques to smaller, faster circuits. Since both languages are embedded inside general-purpose classical host languages (Haskell and F#), they can be used to build useful abstractions on top of quantum circuits, allowing for general purpose quantum programming. As is the case with classical programs, however, quantum programs in these languages will invariably have bugs. Since quantum circuits are inherently expensive to run (either simulated or on a real quantum computer) and are difficult or impossible to debug at runtime, numerous techniques have been developed to verify properties of quantum programs.
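    To make the denotational target concrete, here is a standard textbook example (ours, not taken from the paper): a circuit that applies a Hadamard gate to |0⟩ and then measures is interpreted as a superoperator on density matrices, yielding the uniform classical mixture of |0⟩ and |1⟩.

```latex
% Standard example; notation is ours.  A unitary U acts on a density
% matrix rho as E_U(rho) = U rho U^{\dagger}; computational-basis
% measurement acts as M(rho) = \sum_i |i><i| rho |i><i|.
\mathcal{M}\bigl(H\,|0\rangle\langle 0|\,H^{\dagger}\bigr)
  \;=\; \tfrac{1}{2}\,|0\rangle\langle 0| \;+\; \tfrac{1}{2}\,|1\rangle\langle 1|
```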
  • Partitioned System with XtratuM on PowerPC
    Master's thesis (Tesina de Máster) in Automática e Informática Industrial. Partitioned System with XtratuM on PowerPC. Author: Rui Zhou. Advisor: Prof. Alfons Crespo i Lorente. December 2009.
    Contents (excerpt):
    1. Introduction
      1.1. MILS
      1.2. ARINC 653
      1.3. PikeOS
      1.4. ADEOS
    2. Overview of XtratuM
      2.1. Virtualization and Hypervisor
      2.2. XtratuM
    3. Overview of PowerPC
      3.1. POWER
      3.2. PowerPC
      3.3. PowerPC in Safety-critical
    4. Main PowerPC Drivers to Virtualize
      4.1. Processors
      4.2. Timer
      4.3. Interrupt
      4.4. Memory
    5. Porting Implementation
      5.1. Hypercall
      5.2. Timer
      5.3. Interrupt
      5.4. Memory
      5.5. Partition
    6. Benchmark
    7. Conclusions and Future Work
    Abstract. Nowadays, the diversity of embedded applications has developed into a new stage with the availability of various new high-performance processors and low-cost on-chip memory. As a result of these new advances in hardware, there is a ...