Formalized Structured Analysis Specifications David L
Recommended publications
-
Introduction Development Lifecycle Model
Development of a Comprehensive Software Engineering Environment. Thomas C. Hartrum and Gary B. Lamont, Department of Electrical and Computer Engineering, School of Engineering, Air Force Institute of Technology, Wright-Patterson AFB, Dayton, Ohio 45433. Abstract: The generation of a set of tools for the software lifecycle is a recurring theme in the software engineering literature. The development of such tools and their integration into a software development environment is a difficult task at best, because of the magnitude (number of variables) and the complexity (combinatorics) of the software lifecycle process. An initial development of a global approach was initiated at AFIT in 1982 as the Software Development Workbench (SDW). Other restricted environments have also evolved, emphasizing Ada and distributed processing. Continuing efforts focus on tool development, tool integration, human interfacing (graphics; SADT, DFD, structure charts, ...), data dictionaries, and testing algorithms. The concept of an integrated software development environment can be realized at two distinct levels. The first level deals with the access and usage mechanisms for the interactive tools, while the second level concerns the preservation of software development data and the relationships between the products of the different software life cycle stages. The first level requires that all of the component tools be resident under one operating system and be accessible through a common user interface. The second level dictates the need to store development data (requirements specifications, designs, code, test plans and procedures, manuals, etc.) in an integrated data base that preserves the relationships between the products of the different life cycle stages.
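The second level described here, an integrated data base that preserves relationships between lifecycle products, can be pictured with a small sketch. The following Python fragment is illustrative only (the Artifact and ArtifactStore names are our assumptions, not part of the SDW): it stores products of different stages and the typed links between them, so a requirement can be traced forward to the code that implements it.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Artifact:
    """A product of one lifecycle stage (requirement, design, code, test plan...)."""
    artifact_id: str
    stage: str      # e.g. "requirements", "design", "code", "test"
    title: str

class ArtifactStore:
    """Toy integrated data base: artifacts plus typed links between stages."""
    def __init__(self) -> None:
        self.artifacts: dict[str, Artifact] = {}
        self.links: list[tuple[str, str, str]] = []  # (from_id, relation, to_id)

    def add(self, artifact: Artifact) -> None:
        self.artifacts[artifact.artifact_id] = artifact

    def link(self, from_id: str, relation: str, to_id: str) -> None:
        self.links.append((from_id, relation, to_id))

    def trace(self, artifact_id: str) -> list[Artifact]:
        """Follow links forward to every downstream product of artifact_id."""
        downstream = []
        for from_id, _, to_id in self.links:
            if from_id == artifact_id:
                downstream.append(self.artifacts[to_id])
                downstream.extend(self.trace(to_id))
        return downstream

store = ArtifactStore()
store.add(Artifact("R1", "requirements", "User shall log in"))
store.add(Artifact("D1", "design", "Login module design"))
store.add(Artifact("C1", "code", "login.c"))
store.link("R1", "satisfied-by", "D1")
store.link("D1", "implemented-by", "C1")
print([a.artifact_id for a in store.trace("R1")])  # ['D1', 'C1']
```

-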
Rugby - a Process Model for Continuous Software Engineering
INSTITUT FÜR INFORMATIK DER TECHNISCHEN UNIVERSITÄT MÜNCHEN, Forschungs- und Lehreinheit I, Angewandte Softwaretechnik. Rugby - A Process Model for Continuous Software Engineering. Stephan Tobias Krusche. Complete reprint of the dissertation approved by the Fakultät für Informatik of the Technische Universität München for the award of the academic degree of Doktor der Naturwissenschaften (Dr. rer. nat.). Chair: Univ.-Prof. Dr. Helmut Seidl. Examiners: 1. Univ.-Prof. Bernd Brügge, Ph.D.; 2. Prof. Dr. Jürgen Börstler, Blekinge Institute of Technology, Karlskrona, Sweden. The dissertation was submitted to the Technische Universität München on 28.01.2016 and accepted by the Fakultät für Informatik on 29.02.2016. Abstract: Software is developed in increasingly dynamic environments. Organizations need the capability to deal with uncertainty and to react to unexpected changes in requirements and technologies. Agile methods already improve flexibility towards change, and with the emergence of continuous delivery, regular feedback loops have become possible. The abilities to maintain high code quality through reviews, to release software regularly, and to collect and prioritize user feedback are necessary for continuous software engineering. However, no uniform process model exists that handles the increasing number of reviews, releases and feedback reports. In this dissertation, we describe Rugby, a process model for continuous software engineering that is based on a meta model which treats development activities as parallel workflows and which allows tailoring, customization and extension. Rugby includes a change model and treats changes as events that activate workflows. It integrates review management, release management, and feedback management as workflows. As a consequence, Rugby handles the increasing number of reviews, releases and feedback reports and at the same time decreases their size and effort.
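A rough sketch of Rugby's central idea, changes as events that activate parallel workflows, is given below in Python. The three workflow names come from the abstract; the dispatcher and handler shapes are our illustrative assumptions, not Rugby's actual meta model.

```python
from collections import defaultdict
from typing import Callable

# Event dispatcher: change events activate the workflows registered for them.
handlers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

def on(event_type: str):
    """Register a workflow to be activated by a given change event."""
    def register(handler: Callable[[dict], None]):
        handlers[event_type].append(handler)
        return handler
    return register

def emit(event_type: str, payload: dict) -> None:
    """Fire a change event; every registered workflow reacts independently."""
    for handler in handlers[event_type]:
        handler(payload)

@on("commit")
def review_management(event: dict) -> None:
    print(f"review: open review for {event['change']}")

@on("review_approved")
def release_management(event: dict) -> None:
    print(f"release: build and deliver {event['change']}")

@on("release_installed")
def feedback_management(event: dict) -> None:
    print(f"feedback: collect user reports on {event['change']}")

emit("commit", {"change": "fix login timeout"})
emit("review_approved", {"change": "fix login timeout"})
emit("release_installed", {"change": "fix login timeout"})
```

-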
INCOSE: the FAR Approach “Functional Analysis/Allocation and Requirements Flowdown Using Use Case Realizations”
in Proceedings of the 16th International Symposium of the International Council on Systems Engineering (INCOSE'06), Orlando, FL, USA, Jul 2006. The FAR Approach - Functional Analysis/Allocation and Requirements Flowdown Using Use Case Realizations. Magnus Eriksson1,2, Kjell Borg1, Jürgen Börstler2. 1BAE Systems Hägglunds AB, SE-891 82 Örnsköldsvik, Sweden ({magnus.eriksson, kjell.borg}@baesystems.se). 2Umeå University, SE-901 87 Umeå, Sweden ({magnuse, jubo}@cs.umu.se). Copyright © 2006 by Magnus Eriksson, Kjell Borg and Jürgen Börstler. Published and used by INCOSE with permission. Abstract. This paper describes a use case driven approach for functional analysis/allocation and requirements flowdown. The approach utilizes use cases and use case realizations for functional architecture modeling, which in turn forms the basis for design synthesis and requirements flowdown. We refer to this approach as the FAR (Functional Architecture by use case Realizations) approach. The FAR approach is currently applied in several large-scale defense projects within BAE Systems Hägglunds AB and the experience so far is quite positive. The approach is illustrated throughout the paper using the well known Automatic Teller Machine (ATM) example. INTRODUCTION. Organizations developing software intensive defense systems, for example vehicles, are today faced with a number of challenges. These challenges are related to the characteristics of both the market place and the system domain.
• Systems are growing ever more complex, consisting of tightly integrated mechanical, electrical/electronic and software components.
• Systems have very long life spans, typically 30 years or longer.
• Due to reduced acquisition budgets, these systems are often developed in relatively short series, ranging from only a few to several hundred units.
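To make the flowdown idea concrete, here is a minimal Python sketch built around the paper's ATM example. The class names, steps, and timing requirements are our illustrative assumptions, not the paper's actual models: a use case realization allocates the steps of a use case to subsystems, and system requirements then flow down to the subsystems allocated to their steps.

```python
from dataclasses import dataclass, field

@dataclass
class Subsystem:
    name: str
    requirements: list[str] = field(default_factory=list)

@dataclass
class UseCaseRealization:
    """Allocates the steps of a use case to the subsystems that perform them."""
    use_case: str
    allocation: dict[str, Subsystem]   # step -> responsible subsystem

def flow_down(realization: UseCaseRealization, system_reqs: dict[str, str]) -> None:
    """Push each system requirement onto the subsystem allocated to its step."""
    for step, req in system_reqs.items():
        subsystem = realization.allocation[step]
        subsystem.requirements.append(f"[{realization.use_case}/{step}] {req}")

ui = Subsystem("User Interface")
bank = Subsystem("Bank Interface")
dispenser = Subsystem("Cash Dispenser")

withdraw = UseCaseRealization(
    "Withdraw Cash",
    {"authenticate": ui, "authorize": bank, "dispense": dispenser},
)
flow_down(withdraw, {
    "authenticate": "Read card and validate PIN within 5 s",
    "authorize": "Confirm available funds with host bank within 3 s",
    "dispense": "Dispense requested amount in correct denominations",
})
for s in (ui, bank, dispenser):
    print(s.name, "->", s.requirements)
```

-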
MAKING DATA MODELS READABLE David C. Hay
MAKING DATA MODELS READABLE. David C. Hay, Essential Strategies, Inc. "Confusion and clutter are failures of [drawing] design, not attributes of information. And so the point is to find design strategies that reveal detail and complexity, rather than to fault the data for an excess of complication. Or, worse, to fault viewers for a lack of understanding." (Edward R. Tufte1) Entity/relationship models (or simply "data models") are powerful tools for analyzing and representing the structure of an organization. Properly used, they can reveal subtle relationships between elements of a business. They can also form the basis for robust and reliable data base design. Data models have gotten a bad reputation in recent years, however, as many people have found them to be more trouble to produce and less beneficial than promised. Discouraged, people have gone on to other approaches to developing systems, often abandoning modeling altogether in favor of simply starting with system design. Requirements analysis remains important, however, if systems are ultimately to do something useful for the company. And modeling, especially data modeling, is an essential component of requirements analysis. It is important, therefore, to try to understand why data models have been getting such a "bad rap". Your author believes, along with Tufte, that the fault lies in the way most people design the models, not in the underlying complexity of what is being represented. An entity/relationship model has two primary objectives: First, it represents the analyst's public understanding of an enterprise, so that the ultimate consumer of a prospective computer system can be sure that the analyst got it right.
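The bare structure an entity/relationship model captures can be sketched in a few lines. The Python below is purely illustrative (the Customer/Order example and all names are ours, not Hay's): entities with attributes, plus named relationships with cardinalities, from which a readable sentence, and ultimately a diagram or database schema, can be generated.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Entity:
    name: str
    attributes: tuple[str, ...]

@dataclass(frozen=True)
class Relationship:
    """A named, directed relationship with a cardinality such as '1..*'."""
    source: Entity
    verb: str
    target: Entity
    cardinality: str

customer = Entity("Customer", ("customer_id", "name"))
order = Entity("Order", ("order_id", "date"))
places = Relationship(customer, "places", order, "1..*")

def describe(rel: Relationship) -> str:
    """Render a relationship as a readable sentence, one per diagram line."""
    return f"Each {rel.source.name} {rel.verb} {rel.cardinality} {rel.target.name}(s)."

print(describe(places))  # Each Customer places 1..* Order(s).
```

-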
The Roots of Software Engineering*
THE ROOTS OF SOFTWARE ENGINEERING* Michael S. Mahoney, Princeton University (CWI Quarterly 3,4 (1990), 325-334). At the International Conference on the History of Computing held in Los Alamos in 1976, R.W. Hamming placed his proposed agenda in the title of his paper: "We Would Know What They Thought When They Did It."1 He pleaded for a history of computing that pursued the contextual development of ideas, rather than merely listing names, dates, and places of "firsts". Moreover, he exhorted historians to go beyond the documents to "informed speculation" about the results of undocumented practice. What people actually did and what they thought they were doing may well not be accurately reflected in what they wrote and what they said they were thinking. His own experience had taught him that. Historians of science recognize in Hamming's point what they learned from Thomas Kuhn's Structure of Scientific Revolutions some time ago, namely that the practice of science and the literature of science do not necessarily coincide. Paradigms (or, if you prefer with Kuhn, disciplinary matrices) direct not so much what scientists say as what they do. Hence, to determine the paradigms of past science, historians must watch scientists at work practicing their science. We have to reconstruct what they thought from the evidence of what they did, and that work of reconstruction in the history of science has often involved a certain amount of speculation informed by historians' own experience of science. That is all the more the case in the history of technology, where up to the present century the inventor and engineer have, as Derek Price once put it, "thought with their fingertips", leaving the record of their thinking in the artefacts they have designed rather than in texts they have written.
-
Agile Methods: Where Did “Agile Thinking” Come From?
Historical Roots of Agile Methods: Where did "Agile Thinking" Come from? Noura Abbas, Andrew M Gravell, Gary B Wills. School of Electronics and Computer Science, University of Southampton, Southampton, SO17 1BJ, United Kingdom. {na06r, amg, gbw}@ecs.soton.ac.uk. Abstract. The appearance of Agile methods has been the most noticeable change to software process thinking in the last fifteen years [16], but in fact many of the "Agile ideas" have been around since the 1970s or even before. Many studies and reviews of Agile methods ascribe their emergence to a reaction against traditional methods. In this paper, we argue that although Agile methods are new as a whole, they have strong roots in the history of software engineering. In addition to the iterative and incremental approaches that have been in use since 1957 [21], people who criticised the traditional methods suggested alternative approaches which were actually Agile ideas, such as the response to change, customer involvement, and working software over documentation. The authors of this paper believe that education about the history of Agile thinking will help to develop better understanding as well as promote the use of Agile methods. We therefore present and discuss the reasons behind the development and introduction of Agile methods: as a reaction to traditional methods, as a result of people's experience, and in particular as a reuse of ideas from history. Keywords: Agile Methods, Software Development, Foundations and Conceptual Studies of Agile Methods. 1 Introduction. Many reviews, studies and surveys have been conducted on Agile methods [1, 20, 15, 23, 38, 27, 22].
-
Alignment of Business and Application Layers of Archimate
Alignment of Business and Application Layers of ArchiMate. Oleg Svatoš1, Václav Řepa1. 1Department of Information Technologies, Faculty of Informatics and Statistics, University of Economics, Prague. [email protected], [email protected]. Abstract. ArchiMate is a widely used standard for modelling enterprise architecture and organizes the model in separate yet interconnected layers. Unfortunately, the ArchiMate specification provides only rather brief detail on the relations between the individual layers and the guidelines for keeping them aligned. In this paper we focus on the alignment between the business and application layers, especially the alignment of business processes and application functions, which we see as a crucial condition for a proper and consistent enterprise architecture. For this alignment we propose an extension for ArchiMate based on the development done in the Methodology for Modelling and Analysis of Business Process (MMABP). Application of the proposed extension is illustrated on an example and its benefits discussed. Keywords: business process model, enterprise architecture, ArchiMate, TOGAF, MMABP, functional model, object life cycle. 1 Introduction. The enterprise architecture plays an important role in current corporate management. Enterprises have realized that, after years of initial spontaneous growth and accumulation of different legacy systems, it is necessary to bring the living organism that forms an enterprise under some kind of control in order to manage it. Enterprise architecture is one of the tools that help them do so. There are different frameworks for capturing enterprise architectures, such as FEAF [20], DoDAF [21], and ISO 19439 [22], of which the most popular standard is TOGAF [14], accompanied by ArchiMate as the modelling language [1].
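A minimal sketch of the alignment problem the paper addresses is shown below in Python. All names are invented for illustration, and this is not MMABP's actual extension: each business process step should be supported by at least one application function, and a simple consistency check surfaces the unaligned steps.

```python
# Toy alignment check between ArchiMate-style layers: every business
# process step should be served by some application function.
business_process = {
    "Handle Claim": ["register claim", "assess damage", "pay out"],
}
application_functions = {
    "register claim": "Claim Intake Service",
    "pay out": "Payment Service",
}

def check_alignment(process: dict[str, list[str]],
                    functions: dict[str, str]) -> list[str]:
    """Return business steps with no supporting application function."""
    return [step
            for steps in process.values()
            for step in steps
            if step not in functions]

unaligned = check_alignment(business_process, application_functions)
print("Unaligned steps:", unaligned)  # ['assess damage']
```

-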
Fundamentals of a Discipline of Computer Program and Systems Design, Edward Yourdon, Larry L. Constantine
Structured Design: Fundamentals of a Discipline of Computer Program and Systems Design, Edward Yourdon, Larry L. Constantine, Yourdon Press, 1978, 0917072111, 9780917072116. DOWNLOAD http://bit.ly/1gngZ6L The practical guide to structured systems design, Meilir Page-Jones, 1986, Computers, 354 pages. Structured programming: tutorial, Part 3, Victor R. Basili, Terry Baker, IEEE Computer Society, 1975, Computers, 241 pages. Method integration: concepts and case studies, Klaus Kronlöf, 1993, 402 pages. Using a systematic approach, it describes the concept of method integration. Presents a detailed investigation of several major techniques, including Information Modeling... Structured walkthroughs, Edward Yourdon, 1985, 140 pages. A-Plus Pascal, Richard Tingey, Suzanne Loyer Antink, Feb 1, 1994, 741 pages. Current and timely, this textbook prepares students for the College Entrance Board's Advanced Placement Computer Science Exam. Structured systems development, Ken Orr, Jan 1, 1977, Computers, 170 pages. Problem solving and computer programming, Peter Grogono, Sharon H. Nelson, 1982, Mathematics, 284 pages. Problem Solving Using Pascal: Algorithm Development and Programming Concepts, Romualdas Skvarcius, 1984, Computers, 673 pages. Programming logic: structured design, Anne L. Topping, Ian L. Gibbons, Jan 1, 1985, Computers, 285 pages. Reliable software through composite design, Glenford J. Myers, 1975, Computers, 159 pages. Structured Programming: Theory and Practice, Richard C. Linger, Jan 1, 1979, Computers, 402 pages. Precision programming. Elements of logical expression. Elements of program expression. Structured programs. Reading structured programs. The correctness of structured programs...
-
A Comparative Analysis of Structured and Object-Oriented Programming Methods
JASEM ISSN 1119-8362. Full-text available online at www.bioline.org.br/ja. J. Appl. Sci. Environ. Manage., December 2008, Vol. 12(4), 41-46. All rights reserved. A Comparative Analysis of Structured and Object-Oriented Programming Methods. ASAGBA, PRINCE OGHENEKARO; OGHENEOVO, EDWARD E. CPN, MNCS. Department of Computer Science, University of Port Harcourt, Port Harcourt, Nigeria. [email protected], [email protected]. 08056023566. ABSTRACT: The concepts of structured and object-oriented programming methods are not relatively new, but these approaches are still very much useful and relevant in today's programming paradigm. In this paper, we distinguish the features of structured programs from those of object-oriented programs. Structured programming is a method of organizing and coding programs that can provide easy understanding and modification, whereas object-oriented programming (OOP) consists of a set of objects, which can vary dynamically, and which can execute by acting and reacting to each other, in much the same way that a real-world process proceeds (the interaction of real-world objects). An object-oriented approach makes programs more intuitive to design, faster to develop, more amenable to modifications, and easier to understand. With the traditional, procedural-oriented/structured programming, a program describes a series of steps to be performed (an algorithm). In the object-oriented view of programming, instead of programs consisting of sets of data loosely coupled to many different procedures, object-oriented programs
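The contrast the authors draw can be shown side by side. The Python sketch below is ours, not the paper's: the structured version is a sequence of procedures operating on passive data, while the object-oriented version bundles that data with the operations that act on it.

```python
# Structured style: data and a series of procedures that transform it.
def make_account(owner: str) -> dict:
    return {"owner": owner, "balance": 0}

def deposit(account: dict, amount: int) -> None:
    account["balance"] += amount

acct = make_account("Ada")
deposit(acct, 100)
print(acct["balance"])  # 100

# Object-oriented style: the same behavior packaged as an object that
# acts and reacts through its methods, hiding its internal state.
class Account:
    def __init__(self, owner: str) -> None:
        self.owner = owner
        self._balance = 0

    def deposit(self, amount: int) -> None:
        self._balance += amount

    @property
    def balance(self) -> int:
        return self._balance

obj = Account("Ada")
obj.deposit(100)
print(obj.balance)  # 100
```

-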
Leveraging Software Development Approaches in Systems Engineering
Raytheon. Leveraging Software Development Approaches in Systems Engineering. Rick Steiner, Engineering Fellow, Raytheon Integrated Defense Systems. [email protected]. 6 May 2004, Naval Postgraduate School SI4000 Project Seminar. Copyright © 2003 Raytheon Company. UNPUBLISHED WORK. ALL RIGHTS RESERVED.
We're going to talk about:
• Why Software Tools exist, why Systems Engineers should care
• Software vs. SE as a discipline: key differences
• The importance of requirements
– Different requirement/system development approaches
– Pros and cons of each, and how they relate to software approaches
• How Use Cases relate to Requirements
– Hints on how to manage use case development
• How Object Oriented Design relates to Functional Analysis, or not!
• What graphical languages can help (UML, SysML)
• The promise of Model Driven Architecture (MDA)
Software Development Crisis
• In the 1980s, software development underwent a crisis:
– Software was RAPIDLY proliferating
– Software was becoming very complex: software on top of software (OS, application), software talking to software (interfaces)
– Software development delays were holding up system delivery
– Software was becoming very expensive to develop and maintain
– Software development effort was becoming very hard to estimate
– Software reliability was becoming problematic
– Existing techniques were proving inadequate to manage the problem
• Reasons:
– Economics: processing hardware (silicon) got cheap
-
Chapter 1: Introduction
Just Enough Structured Analysis. Chapter 1: Introduction.
"The beginnings and endings of all human undertakings are untidy, the building of a house, the writing of a novel, the demolition of a bridge, and, eminently, the finish of a voyage." (John Galsworthy, Over the River, 1933)
www.yourdon.com ©2006 Ed Yourdon - rev. 051406
In this chapter, you will learn:
1. Why systems analysis is interesting;
2. Why systems analysis is more difficult than programming; and
3. Why it is important to be familiar with systems analysis.
Chances are that you groaned when you first picked up this book, seeing how heavy and thick it was. The prospect of reading such a long, technical book is enough to make anyone gloomy; fortunately, just as long journeys take place one day at a time, and ultimately one step at a time, so long books get read one chapter at a time, and ultimately one sentence at a time.
1.1 Why is systems analysis interesting?
Long books are often dull; fortunately, the subject matter of this book, systems analysis, is interesting. In fact, systems analysis is more interesting than anything I know, with the possible exception of sex and some rare vintages of Australian wine. Without a doubt, it is more interesting than computer programming (not that programming is dull), because it involves studying the interactions of people, and disparate groups of people, and computers and organizations. As Tom DeMarco said in his delightful book, Structured Analysis and Systems Specification (DeMarco, 1978), [systems] analysis is frustrating, full of complex interpersonal relationships, indefinite, and difficult.
-
Edward Yourdon
EDWARD YOURDON is an internationally-recognized computer consultant, as well as the author of more than two dozen books, including Byte Wars, Managing High-Intensity Internet Projects, Death March, Rise and Resurrection of the American Programmer, and Decline and Fall of the American Programmer. His latest book, Outsource: competing in the global productivity race, discusses both current and future trends in offshore outsourcing, and provides practical strategies for individuals, small businesses, and the nation to cope with this unstoppable tidal wave. According to the December 1999 issue of Crosstalk: The Journal of Defense Software Engineering, Ed Yourdon is one of the ten most influential men and women in the software field. In June 1997, he was inducted into the Computer Hall of Fame, along with such notables as Charles Babbage, Seymour Cray, James Martin, Grace Hopper, Gerald Weinberg, and Bill Gates. Ed is widely known as the lead developer of the structured analysis/design methods of the 1970s, as well as a co-developer of the Yourdon/Whitehead method of object-oriented analysis/design and the Coad/Yourdon OO methodology in the late 1980s and 1990s. He was awarded a Certificate of Merit by the Second International Workshop on Computer-Aided Software Engineering in 1988, for his contributions to the promotion of Structured Methods for the improvement of Information Systems Development, leading to the CASE field. He was selected as an Honored Member of Who's Who in the Computer Industry in 1989. And he was given the Productivity Award in 1992 by Computer Language magazine, for his book Decline and Fall of the American Programmer.