Future Generation Supercomputers II: a Paradigm for Cluster Architecture


Future Generation Supercomputers II: A Paradigm for Cluster Architecture

N. Venkateswaran§, Deepak Srinivasan†, Madhavan Manivannan†, T. P. Ramnath Sai Sagar†, Shyamsundar Gopalakrishnan†, VinothKrishnan Elangovan†, Arvind M†, Prem Kumar Ramesh, Karthik Ganesan, Viswanath Krishnamurthy, Sivaramakrishnan

Abstract

In part I, a novel multi-core node architecture was proposed which, when employed in a cluster environment, would be capable of tackling the computational complexity associated with a wide class of applications. Furthermore, it was discussed that by appropriately scaling the architectural specifications, Teraops computing power could be achieved at the node level. In order to harness the computational power of such a node, we have developed an efficient application execution model with a competent cluster architectural backbone. In this paper we present the novel cluster paradigm, dealing with operating system design, the parallel programming model and the cluster interconnection network. Our approach to developing the cluster design revolves around an execution model that aids the execution of multiple applications simultaneously on all partitions of the cluster, leading to cost sharing across applications. This would be a major initiative towards achieving cost-effective supercomputing.

1. Introduction

High performance monolithic clusters, having good performance and scalability, are becoming increasingly popular in the research community for their ability to cater to specific application requirements. The level of performance is characterized by the node architecture, network topology, compiler, parallel programming paradigm and operating system. Making better design choices would improve the execution time of large scale applications, which is currently predicted to be in Teraflop-years. In this paper, we discuss the impact of these design choices on application performance and provide insights into a supercomputing model which would cater to the demands of the next generation of grand challenge applications.

Performance analysis carried out by Adolfy et al. [1] on Blue Gene/L, Red Storm, and ASC Purple clearly shows that these machines, although significantly diverse along the afore-mentioned design parameters, offer good performance during "Grand Challenge Application" execution. But future generation applications might require close coupling of previously independent application models, as highlighted in NASA's report on Earth Science Vision 2030 [2], which involves simulations on coupled climate models such as ocean, atmosphere, biosphere and solid earth. These kinds of hybrid applications call for simultaneous execution of the component applications, since the execution of different applications on separate clusters may not be prudent in the context of the constraints laid down by cluster performance and operational cost.

There is hence a need to develop an execution model for cost-effective supercomputing which envisages the simultaneous execution of multiple applications on all partitions of a single cluster (without sacrificing the performance of the individual applications), unlike the current models in which different applications are executed in independent partitions of the cluster. Future supercomputing models should also address critical design aspects like reliability, fault tolerance and low power, which are increasingly becoming important design criteria. This paper, along with part I [3], conjointly proposes a supercomputing model which is expected to offer a superior performance/cost ratio and deal with the rigors posed by the computational requirements of hybrid applications (composed of interdependent applications).

This execution model introduces new challenges in the cluster architecture and operating system, which must handle the increased mapping complexity and tracking during the execution of multiple applications. Also, the programming paradigm adopted should help exploit both node and cluster architecture characteristics and ease the programming complexities involved in application development. However, the support for executing the diverse workloads encountered during simultaneous multiple-application execution lies in the efficient design of the node architecture. In part I [3], we discuss the capability of MIP-based (Memory In Processor) heterogeneous multi-core node architectures to handle SMAG (Simultaneous Multiple AlGorithms) execution, aiding the proposed execution model by running traces of multiple applications in the same node. But in the context of the proposed execution model, a new OS paradigm is required for handling the complexities associated with parallel mapping and data tracking of the huge amount of data associated with the different applications. In this scenario, the reliability of the operating system is of paramount importance, as the integrity of I/O data sequencing is critical, particularly when dealing with million-node clusters. Thus the capability of the cluster to absorb the complexities involved in multiple applications' execution lies in an efficient OS design.

The paper is organized into 4 sections. Section 2 discusses the scope for improvement in the design features of current generation clusters in order to meet the requirements of performance-greedy hybrid applications, also taking into consideration the operating cost factor. Section 3 highlights a cluster model that incorporates all the architectural concepts proposed in section 2 and investigates its potential for cost-effective execution of multiple applications. Section 4 addresses the ramifications of this model for performance and the resource utilization profile, and their influence on the performance/cost relation.

§ N. Venkateswaran, Founder Director, WAran Research FoundaTion (WARFT), Chennai, India. Email: [email protected]
† WARFT Research Trainee, 2005-2007
− Former WARFT Research Trainee; current status given at the end of this paper

ACM SIGARCH Computer Architecture News 61 Vol. 35, No. 5, December 2007

2. Design Characteristics of High Performance Clusters

Performance modeling has come a long way in helping researchers characterize cluster design to achieve expected performance. Different methodologies have evolved to accurately compare, analyze and predict the performance of the various designs and features, such as the node and cluster architecture, operating system, and programming paradigm, that have been identified to play dominant roles [4]. We discuss these design issues in high performance clusters and propose new directions for evolving a cluster model to meet the requirements of future generation applications.

2.1 Cluster Interconnection Network

The type of interconnect and the topology adopted affect the overall performance of the communication network. Conventional networks use wired network topologies supported by different technologies for implementing large scale HPC (High Performance Cluster) designs. The most popular choices for network interconnects are Fast Ethernet, Gigabit Ethernet, Myricom Myrinet, and InfiniBand.

The communication patterns of massive applications vary dynamically during execution, and each pattern can be served better by employing a particular interconnection topology. If it were possible to dynamically reconfigure the cluster topology to suit the communication requirements of the instant, it would greatly boost the performance of application execution on the cluster. Although many of the currently employed networks have been successful in satisfying the high bandwidth requirements, they are unable to meet the overwhelming degree of scalability required by hybrid application execution and the …

2.3 Parallel Programming Paradigm and Compiler

Current parallel programming languages are categorized into data parallel languages, explicit communication models and functional languages. These languages either stress data parallelism, as in NESL and C* [5, 6], or the communication model, in which the user is completely responsible for the creation and management of processes and their interactions with one another, as in Emerald or COOL [7]. No single language has been developed which can handle both data parallelism and the communication model efficiently.

With the increasing complexity of applications, the programming model needs to be simple and expressive and also allow programmers to represent complex logic efficiently. For this, the Parallel Programming Language (PPL) model should be a simple and portable form of object-based programming, so that programs can be more easily understood, modified and debugged than their sequential counterparts. The PPL should have constructs capable of exploiting the level of parallelism inherently present in the application, matching the underlying architecture (the ISA of the node architecture). A new PPL model of the MIP SCOC cluster incorporating all the above features will be discussed in section 3.6.

3. MODEL FOR NEXT GENERATION SUPERCOMPUTERS

In order to create a design space for supercomputers, the focus should also be on aspects like power, performance, cost and their related tradeoffs. In this section we present a conceptual model (fig. 1) for cluster design taking into consideration all the design issues discussed in section 2. The cluster model comprises MIP-paradigm based heterogeneous …
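The reconfiguration argument in section 2.1 can be made concrete with a toy hop-count model. The sketch below is purely illustrative and not from the paper: it assumes a hypothetical 16-node cluster and two synthetic traffic patterns, and counts shortest-path hops to show that nearest-neighbour traffic favours a ring while long-range exchanges favour a hypercube, so no single fixed topology is best for all phases.

```python
# Toy illustration (our own, not the authors' model): communication cost of
# two synthetic traffic patterns under two interconnection topologies.

def ring_hops(src, dst, n):
    """Shortest distance between two nodes on an n-node ring."""
    d = abs(src - dst)
    return min(d, n - d)

def hypercube_hops(src, dst):
    """Hypercube distance = Hamming distance of the node IDs."""
    return bin(src ^ dst).count("1")

def pattern_cost(pattern, hops):
    """Total hop count of a set of (src, dst) messages."""
    return sum(hops(s, d) for s, d in pattern)

n = 16  # hypothetical cluster size (must be a power of two for the hypercube)

# Pattern 1: nearest-neighbour shift, typical of stencil computations.
neighbor_shift = [(i, (i + 1) % n) for i in range(n)]
# Pattern 2: long-range butterfly exchange, typical of FFT-like phases.
butterfly = [(i, i ^ (n // 2)) for i in range(n)]

cost_ring_shift = pattern_cost(neighbor_shift, lambda s, d: ring_hops(s, d, n))
cost_cube_shift = pattern_cost(neighbor_shift, hypercube_hops)
cost_ring_bfly = pattern_cost(butterfly, lambda s, d: ring_hops(s, d, n))
cost_cube_bfly = pattern_cost(butterfly, hypercube_hops)

# The ring wins for the shift pattern; the hypercube wins for the butterfly,
# which is why a dynamically reconfigurable network could help per phase.
print(cost_ring_shift, cost_cube_shift)  # ring cheaper for shifts
print(cost_ring_bfly, cost_cube_bfly)    # hypercube cheaper for butterflies
```

A per-phase scheduler on such a reconfigurable network would simply pick the topology minimizing this cost for the upcoming communication pattern.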