10.2.2 IDEF0 Lessons Learned


IDEF0 Lessons Learned

Darold K. Smith
UGS Corp, Systems Engineering Consulting Services
4088 County Road 3326; Greenville, TX USA
+1.903.883.0781
[email protected]

Copyright © 2006 by Darold K. Smith. Published and used by INCOSE with permission.

Abstract. The IDEF0 methodology is becoming widely used as a modeling tool in all types of systems. If a program is considering using IDEF0, there are serious pitfalls to avoid to ensure that IDEF0 provides as accurate and complete a system definition as it is capable of supporting. The pitfalls are illustrated with three case histories that support the lessons learned. Some lessons apply to other methodologies beyond IDEF0. Although IDEF0 has been in use for many years, it is still being applied to new programs, so the lessons learned remain timely.

IDEF0 Background

Readers searching for a functional analysis methodology find many favorable statements about the IDEF0 methodology and its use on US government projects. Some of these are excerpted below for background reference.

"In 1991 the National Institute of Standards and Technology (NIST) received support from the U.S. Department of Defense, Office of Corporate Information Management (DoD/CIM), to develop one or more Federal Information Processing Standards (FIPS) for modeling techniques. The techniques selected were IDEF0 (FIPS-183 1993) for function modeling and IDEF1X (FIPS-184 1993) for information modeling."[1]

"IDEF0 (Integration DEFinition language 0) is based on SADT™ (Structured Analysis and Design Technique™), developed by Douglas T. Ross and SofTech, Inc. In its original form, IDEF0 includes both a definition of a graphical modeling language (syntax and semantics) and a description of a comprehensive methodology for developing models."[2]

IDEF0 is the preferred functional / process modeling methodology cited in the IDEF0 standard and is the graphical diagramming tool used in the Data Standardization Procedures (DoD8320 1998). US Government projects, including Department of Defense (DoD) projects such as the DoD Architecture Framework (DoDAF), use IDEF0 to illustrate system views even though they do not require it for graphical modeling.

According to the IDEF0 Standard[3], "IDEF0 is a modeling technique based on combined graphics and text that are presented in an organized and systematic way to gain understanding, support analysis, provide logic for potential changes, specify requirements, or support systems level design and integration activities. An IDEF0 model is composed of a hierarchical series of diagrams that gradually display increasing levels of detail describing functions and their interfaces within the context of a system. …"

"IDEF0 is an engineering technique for performing and managing needs analysis, benefits analysis, requirements definition, functional analysis, systems design, maintenance, and baselines for continuous improvement. IDEF0 models provide a 'blueprint' of functions and their interfaces that must be captured and understood in order to make systems engineering decisions that are logical, affordable, integratable and achievable. The IDEF0 model reflects how system functions interrelate and operate just as the blueprint of a product reflects how the different pieces of a product fit together. When used in a systematic way, IDEF0 provides a systems engineering approach to:

"1. Performing systems analysis and design at all levels, for systems composed of people, machines, materials, computers and information of all varieties - the entire enterprise, a system, or a subject area …."

From the front matter[4], "As a function modeling language, IDEF0 has the following characteristics:

"1. It is a coherent and simple language, providing for rigorous and precise expression, and promoting consistency of usage and interpretation. (emphasis added) …"

Footnotes:
[1] IDEF0, Background, p. v.
[2] IDEF0, IDEF0 Approach, p. vii.
[3] IDEF0, 3.1 Model Concepts, p. 19.
[4] IDEF0, IDEF0 Approach, p. vii.

IDEF0 Standard Content: The IDEF0 standard contains these sections:

1. Section 3, IDEF0 Models: model syntax (box / activity, arrows / data, and labeling) and diagramming (functional decomposition, arrow rules, and node trees).
2. Appendix A, IDEF0 Concepts: progressive decomposition and disciplined teamwork.
3. Appendix B, User's Guide to Creating IDEF0 Diagrams: applying the standards in Section 3.
4. Appendix C, Review Cycle Procedures and Forms: creating review materials (kits) and IDEF0 model walk-through procedures.

Tool Availability: One can create IDEF0 diagrams with any basic drawing package, such as Microsoft PowerPoint™, by following the syntax conventions in the standard. A search of the internet for IDEF0 diagramming tools produces scores of hits, ranging from basic Visio™ templates to specialized IDEF0 tools. The simpler tools, such as the Visio shapes provided by Microsoft, supply a template of shapes for creating IDEF0 diagrams but have no methodology enforcement. Users can, through the Visio API, customize stencils that enforce the IDEF0 standard to a degree. High-end specialized IDEF0 tools provide high adherence to the IDEF0 standard and offer "bundling" control of "arrows" (flows) between functions.

Methodology Selection Rationale: Based on the above information, many projects select IDEF0 as the functional decomposition methodology. There is also a tendency to select free or low-cost tools because of budget constraints and no perceived need for training, particularly when program managers do not appreciate the need for producing high-quality systems engineering work products.
IDEF0 Limitations

SADT, as developed by Ross, includes two complementary methodologies: activity modeling and data modeling. IDEF0 focuses on activity modeling. As a source of confusion, one of the few books on SADT (Marca 1988) uses "SADT" interchangeably with IDEF0, and its only passing reference to data modeling is in the preface (DRoss 1988). Thus, in spite of the statements about rigor and precision quoted from the IDEF0 standard above, a number of key elements of the SADT methodology that supported that rigor were stripped out of IDEF0; the most significant is data modeling. Several important concepts are not embraced in the IDEF0 standard: the data store and the data dictionary.

Data, in the context of this paper, is any entity that is an output or input of any function[5] and is essential for properly defining or understanding the behavior of the process within the context of the system of interest. IDEF0 classifies data flows into one of four classes, Input, Control, Output, or Mechanism (ICOM), with associated rules for how they are represented and used in activity diagrams[6]. So what about the data store and the data dictionary?

Data Store: Data stores provide a mechanism for temporarily or permanently storing data for future access by one or more activities. A data store represents a repository that provides the capability of future retrieval of data. Examples of data stores are paper records, computer memory (dynamic and programmable memory, disk and tape storage), databases, physical storage bins for manufacturing processes, energy storage devices (spring, battery, momentum, heat), etc. The lack of a data store in IDEF0 is cited as a shortcoming in DoDAF Volume II (DoDAF VII 2003): "A unique element of a SV-4 not found in an IDEF0 activity modeling is a system data store, which is used as the source or destination (sink) of an information flow in the form of an information repository. For convenience and consistency, DATA-STORE has been incorporated in the CADM as an additional subtype of PROCESS-ACTIVITY."[7]

Data Dictionary: The concept of a data dictionary was popularized in 1978 by the book Structured Analysis and System Specification (DeMarco 1978). There are two types of data flows, composite and primitive[8]. It is essential that each composite data flow in the system be defined by its constituent composite and / or primitive data flows, and that each primitive data flow be unambiguously defined for the purposes intended in the system. In IDEF0, a data dictionary capability is only alluded to in the topic of branching arrows and the bundling and unbundling of arrows[9]; practitioners who are not familiar with the data dictionary concept are unlikely to recognize it at all.

Data Flow Balancing: Data flow balancing is the process of assuring that all data flows that enter or exit a diagram are accounted for on the corresponding activity of the parent diagram, and that all data flows that enter or exit a function on the diagram are accounted for in any child diagram. For IDEF0 activity diagrams, this process includes tunneled[10] data flows. Although not completely omitted from the IDEF0 standard, the only discussion of this concept in the standard is in an appendix describing the review process to[11] "…test the arrow interface from the parent to the child.

"Criteria for acceptance:
1. There are no missing or extra interface arrows.
2. Boundary arrows are labeled with the proper ICOM codes.
3. Child arrow labels are the same or an elaboration of its parent's matching arrow. Labels convey the correct and complete arrow contents.
4. Examination of the connecting arrows reveal no problems in the parent diagram. (An added interface may create a misunderstanding of the message conveyed by the parent.)"

Footnotes:
[5] Function in this paper includes process and any IDEF0 activity.
[6] Inputs are considered consumed or transformed by a function to produce the output(s) of the function.
[7] DoDAF V II, p. 5-31, CADM Support for Systems Functionality Description (SV-4).
[8] Primitive data flows are those that have the lowest level of definition in the system. A primitive data flow for system functional decomposition may be left as a non-elemental definition, i.e., not sufficiently detailed to implement detail design, but sufficient to derive an unambiguous detailed design from.
[9] IDEF0, 3.3.2.5, p. 23, Branching Arrows text and Figure 11, Arrow Fork and Join Structures.
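To make the ICOM classification and the missing data-store concept concrete, the following Python sketch models an IDEF0 box with its four arrow classes and adds the kind of data-store element that IDEF0 omits and DoDAF's SV-4 restores. It is a minimal illustration only; the class names, node numbers, and flow labels are assumptions of this sketch, not constructs defined by the IDEF0 standard or by the case histories below.

    from dataclasses import dataclass, field
    from enum import Enum
    from typing import List


    class ArrowRole(Enum):
        """The four IDEF0 arrow classes (ICOM)."""
        INPUT = "I"      # consumed or transformed to produce outputs
        CONTROL = "C"    # conditions or governs the activity
        OUTPUT = "O"     # produced by the activity
        MECHANISM = "M"  # means (people, tools, systems) that perform the activity


    @dataclass
    class Arrow:
        label: str
        role: ArrowRole


    @dataclass
    class Activity:
        """One IDEF0 box, identified by its node number (e.g. 'A1')."""
        node: str
        name: str
        arrows: List[Arrow] = field(default_factory=list)

        def arrows_by_role(self, role: ArrowRole) -> List[Arrow]:
            return [a for a in self.arrows if a.role is role]


    @dataclass
    class DataStore:
        """Not an IDEF0 construct: a repository acting as a source or sink
        of data flows, comparable to the DATA-STORE subtype DoDAF adds for SV-4."""
        name: str
        held_flows: List[str] = field(default_factory=list)


    # Usage sketch: an activity with one arrow of each ICOM role, plus a store.
    plan = Activity("A1", "Plan Production", arrows=[
        Arrow("customer_order", ArrowRole.INPUT),
        Arrow("production_policy", ArrowRole.CONTROL),
        Arrow("production_schedule", ArrowRole.OUTPUT),
        Arrow("planning_system", ArrowRole.MECHANISM),
    ])
    schedule_store = DataStore("schedule_file", held_flows=["production_schedule"])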
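The data dictionary idea, in which each composite flow is defined by its constituents and each primitive flow carries an unambiguous definition, can be sketched the same way. The entries below (customer_order, order_header, order_line) are hypothetical examples, not data taken from the paper.

    from dataclasses import dataclass, field
    from typing import Dict, List, Set


    @dataclass
    class FlowDefinition:
        """One data dictionary entry. A composite flow lists its constituent
        flows; a primitive flow carries a textual definition instead."""
        name: str
        constituents: List[str] = field(default_factory=list)  # empty => primitive
        definition: str = ""


    # Hypothetical dictionary for a small ordering process.
    data_dictionary: Dict[str, FlowDefinition] = {
        "customer_order": FlowDefinition(
            "customer_order", constituents=["order_header", "order_line"]),
        "order_header": FlowDefinition(
            "order_header", definition="customer id, order date, ship-to address"),
        "order_line": FlowDefinition(
            "order_line", definition="part number, quantity, unit price"),
    }


    def primitives(name: str, dd: Dict[str, FlowDefinition]) -> Set[str]:
        """Expand a (possibly composite) flow down to its primitive constituents."""
        entry = dd[name]
        if not entry.constituents:
            return {name}
        result: Set[str] = set()
        for child in entry.constituents:
            result |= primitives(child, dd)
        return result


    assert primitives("customer_order", data_dictionary) == {"order_header", "order_line"}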
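Data flow balancing can likewise be reduced to a set comparison between a parent box and the boundary arrows of its child diagram. The sketch below covers only criterion 1 of the checklist quoted above and treats tunneled arrows as exclusions; a real tool would also check ICOM codes, label elaboration, and bundled arrows.

    from typing import Iterable, List, Set


    def check_balance(parent_labels: Set[str],
                      child_boundary_labels: Set[str],
                      tunneled: Iterable[str] = ()) -> List[str]:
        """Report boundary-arrow label mismatches between a parent box and
        its child diagram; tunneled arrows are excluded from the comparison."""
        excluded = set(tunneled)
        problems = []
        for label in sorted(parent_labels - child_boundary_labels - excluded):
            problems.append(f"parent arrow '{label}' is missing on the child diagram")
        for label in sorted(child_boundary_labels - parent_labels - excluded):
            problems.append(f"child arrow '{label}' has no counterpart on the parent box")
        return problems


    # Usage sketch: the child diagram drops a control arrow and adds an
    # undocumented output.
    print(check_balance(
        parent_labels={"customer_order", "production_policy", "production_schedule"},
        child_boundary_labels={"customer_order", "production_schedule", "scrap_report"},
    ))
    # ["parent arrow 'production_policy' is missing on the child diagram",
    #  "child arrow 'scrap_report' has no counterpart on the parent box"]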
Case Histories

Three brief case histories are presented that are the source for illustrating the IDEF0 lessons learned.

Case 1: Company X Applies IDEF0 to Business Expansion Plans

Company X is behind schedule developing requirements for a new and large program.