
IDEF0 Lessons Learned

Darold K. Smith UGS Corp, Engineering Consulting Services 4088 County Road 3326; Greenville, TX USA +1.903.883.0781 [email protected]

Copyright © 2006 by Darold K. Smith. Published and used by INCOSE with permission.

Abstract. The IDEF0 methodology is becoming widely used as a tool in all types of systems. If a program is considering using IDEF0, there are serious pitfalls to avoid to assure that the result is a definition as accurate and complete as IDEF0 is capable of supporting. The pitfalls are illustrated with three case histories that support the lessons learned. Some lessons apply to methodologies beyond IDEF0. Although IDEF0 has been in use for many years, it is still being applied to new programs, so the lessons learned remain timely.

IDEF0 Background

Readers searching for a functional analysis methodology find many favorable statements about the IDEF0 methodology and its use on US government projects. Some of these are excerpted below for background reference. "In 1991 the National Institute of Standards and Technology (NIST) received support from the U.S. Department of Defense, Office of Corporate Information Management (DoD/CIM), to develop one or more Federal Information Processing Standards (FIPS) for modeling techniques. The techniques selected were IDEF0 (FIPS-183 1993) for function modeling and IDEF1X (FIPS-184 1993) for information modeling.1" "IDEF0 (Integration DEFinition language 0) is based on SADT™ (Structured Analysis and Design Technique™), developed by Douglas T. Ross and SofTech, Inc. In its original form, IDEF0 includes both a definition of a graphical modeling language (syntax and semantics) and a description of a comprehensive methodology for developing models2". IDEF0 is the functional / process modeling methodology preferred by the IDEF0 standard and is the graphical diagramming tool used in the DoD Data Standardization Procedures (DoD8320 1998). US Government projects, including Department of Defense (DoD) efforts such as the DoD Architecture Framework (DoDAF), while not requiring IDEF0 for graphical modeling, use it to illustrate system views. According to the IDEF0 Standard3, "IDEF0 is a modeling technique based on combined graphics and text that are presented in an organized and systematic way to gain understanding, support analysis, provide logic for potential changes, specify requirements, or support systems level design and integration activities. An IDEF0 model is composed of a hierarchical series of diagrams that gradually display increasing levels of detail describing functions and their interfaces within the context of a system. …"

1 IDEF0, Background, p v. 2 IDEF0, IDEF0 Approach, p vii. 3 IDEF0 3.1 Model Concepts, p 19.

"IDEF0 is an engineering technique for performing and managing needs analysis, benefits analysis, requirements definition, functional analysis, systems design, maintenance, and baselines for continuous improvement. IDEF0 models provide a "blueprint" of functions and their interfaces that must be captured and understood in order to make decisions that are logical, affordable, integratable and achievable. The IDEF0 model reflects how system functions interrelate and operate just as the blueprint of a product reflects how the different pieces of a product fit together. When used in a systematic way, IDEF0 provides a systems engineering approach to: "1. Performing systems analysis and design at all levels, for systems composed of people, machines, materials, computers and information of all varieties - the entire enterprise, a system, or a subject area …."

From the front matter4, "As a function modeling language, IDEF0 has the following characteristics: "1. It is a coherent and simple language, providing for rigorous and precise expression, and promoting consistency of usage and interpretation. (emphasis added)…" IDEF0 Standard Content: The IDEF0 standard contains these sections: 1. Section 3, IDEF0 Models: IDEF0 model syntax (Box (activity), Arrows (data), and Labeling) and diagramming (Functional Decomposition, Arrow Rules, and Node Trees). 2. Appendix A, IDEF0 Concepts: Progressive Decomposition and Disciplined Teamwork. 3. Appendix B, User's Guide to Creating IDEF0 Diagrams: Applying the standards in Section 3. 4. Appendix C, Review Cycle Procedures and Forms: Create Review Materials (Kit) and IDEF Model Walk-Through Procedures.

Tool Availability: One can create IDEF0 diagrams with any drawing package, such as Microsoft PowerPoint™, by following the syntax conventions in the standard. A search of the internet for IDEF0 diagramming tools produces scores of hits, ranging from basic Visio™ templates to specialized IDEF0 tools. The simpler tools, such as the Visio shapes provided by Microsoft, provide a template of shapes for creating IDEF0 diagrams but have no methodology enforcement. Users can, through the Visio API capabilities, customize stencils that enforce the IDEF0 standards to a degree. High end specialized IDEF0 tools adhere closely to the IDEF0 standard and provide "bundling" control of "arrows" (flows) between functions. Methodology Selection Rationale: Based on the above information, many projects select IDEF0 as the functional decomposition methodology. There is also a tendency to select free or low-cost tools because of budget constraints and no perceived need for training, particularly when program managers don't appreciate the need for producing high-quality systems engineering work products.

4 IDEF0 IDEF Approach, p vii.


IDEF0 Limitations

SADT, as developed by Ross, includes two complementary methodologies, activity modeling and data modeling. IDEF0 focuses on the activity modeling. As a source of confusion, one of the few books on SADT (Marca 1988) uses “SADT” interchangeably with IDEF0, and the only passing reference to data modeling is in the preface (DRoss 1988). Thus, in spite of the statements about rigor and precision in the IDEF0 standard described in the background information above, a number of key elements of the SADT methodology that supported rigor were stripped out of IDEF0, the most significant of which is data modeling. Several important concepts are not embraced in the IDEF0 standard: Data Store and Data Dictionary. Data, in the context of this paper, is any entity that is an output or input of any function5 and is essential for properly defining or understanding the behavior of the process within the context of the system of interest. IDEF0 classifies data flows into one of four classes, Input, Control, Output, or Mechanism (ICOM), with associated rules for how they are represented and used in activity diagrams6. So what about data store and data dictionary? Data Store: Data stores provide a mechanism for temporarily or permanently storing data for future access by one or more activities. A data store represents a repository that provides the capability of future retrieval of data. Examples of data stores are paper records, computer memory (dynamic and programmable memory, disk and tape storage), physical storage bins for manufacturing processes, energy storage devices (spring, battery, momentum, heat), etc. This lack of a data store in IDEF0 is cited as a shortcoming in DoDAF V II (DoDAF VII 2003): "A unique element of a SV-4 not found in an IDEF0 activity model is a system data store, which is used as the source or destination (sink) of an information flow in the form of an information repository. 
For convenience and consistency, DATA-STORE has been incorporated in the CADM as an additional subtype of PROCESS-ACTIVITY.7" Data Dictionary: The concept of a data dictionary was popularized in 1978 by the book Structured Analysis and System Specification (DeMarco 1978). There are two types of data flows, composite and primitive8. It is essential that each composite data flow in the system be defined by its constituent composite and / or primitive data flows and that each primitive data flow be unambiguously defined for the purposes intended in the system. In IDEF0, a data dictionary capability is alluded to in the topic of branching arrows and the bundling and unbundling of arrows9, but practitioners who are not familiar with the data dictionary concept are likely not to recognize the concept at all. Data Flow Balancing: Data flow balancing is the process of assuring that all data flows that enter or exit a child diagram are accounted for on the corresponding activity on the parent diagram and that all data flows that enter or exit a function on the parent diagram are accounted for in any child diagram. For IDEF0 activity diagrams, this process includes tunneled10 data flows. Although not completely omitted in the IDEF0 standard, the only discussion of this concept

5 Function in this paper includes process and any IDEF0 activity. 6 Inputs are considered consumed or transformed by a function to produce the output(s) of the function. 7 DoDAF V II, p 5-31, CADM Support for Systems Functionality Description (SV-4). 8 Primitive data flows are those that have the lowest level of definition in the system. A primitive data flow for system functional decomposition may be left as a non-elemental definition, i.e., not sufficiently detailed to implement detail design, but sufficient to derive an unambiguous detailed design from. 9 IDEF0 3.3.2.5, p 23, Branching Arrows text and Figure 11 Arrow Fork and Join Structures.

in the IDEF0 standard is in an appendix describing the review process to11 "…test the arrow interface from the parent to the child. "Criteria for acceptance: 1. There are no missing or extra interface arrows. 2. Boundary arrows are labeled with the proper ICOM codes. 3. Child arrow labels are the same or an elaboration of its parent's matching arrow. Labels convey the correct and complete arrow contents. 4. Examination of the connecting arrows reveal no problems in the parent diagram. (An added interface may create a misunderstanding of the message conveyed by the parent.)"
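The leveling check quoted above lends itself to automation. A minimal sketch, assuming arrows are represented simply as label strings and ignoring ICOM coding; the function and arrow names are illustrative, not from the standard:

```python
# Data flow balancing sketch: every boundary arrow on a child diagram must
# match an arrow on the corresponding parent activity, and vice versa,
# after excusing tunneled arrows (which deliberately skip levels).

def check_balance(parent_arrows, child_arrows, tunneled=()):
    """Return (missing_in_child, extra_in_child) as sorted lists."""
    parent = set(parent_arrows) - set(tunneled)
    child = set(child_arrows) - set(tunneled)
    missing_in_child = sorted(parent - child)
    extra_in_child = sorted(child - parent)
    return missing_in_child, extra_in_child

# Example: a parent activity shows three arrows; its child diagram shows
# two of them plus one arrow the parent never declared.
missing, extra = check_balance(
    parent_arrows={"Customer Order", "Operating Procedures", "Shipped Product"},
    child_arrows={"Customer Order", "Shipped Product", "Inventory Status"},
)
print(missing)  # ['Operating Procedures']
print(extra)    # ['Inventory Status']
```

On a real project the arrow sets would be extracted from the diagram database rather than typed by hand, but the comparison itself is exactly this simple, which is why omitting it from tooling is hard to excuse.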

Case Histories

Three brief case histories are presented that illustrate the IDEF0 lessons learned.

Case 1: Company X Applies IDEF0 to Business Expansion Plans

Company X is behind schedule developing requirements for a new and large program. The program ventures into a new area of service business for the organization. It is critical to the company to completely define the new and unfamiliar service functions. Traditionally, Company X designs and manufactures the product and specialized spare components. The customer organization assumes responsibility for maintaining the product and all training, maintenance, logistical, and operational support for the product. The new business paradigm is for Company X to manufacture the product and, instead of selling the product, provide services via lease to its customers for customer operator training, delivery to the customer operational site, and all maintenance and logistics support at the customer's operational site(s). Background: Requirements documents were received from several customers but lack operational scenarios to validate them. The program’s systems engineering management suspects the requirements are incomplete and inconsistent. The customer requirements for support services are very high level. The company needs to identify requirements that are unique to each customer to determine how to adapt the product and / or services to minimize the impact on the system design, and to determine the cost of the product and service so the program will be successful, i.e., profitable. The systems engineering manager was seeking an economical, quick-start process to do functional decomposition for validating the existing requirements and identifying new functional requirements necessary to adequately define the requirements for the product and the support services. In addition, the company wanted the process to be consistent with DoDAF to support company plans to embrace the DoDAF concepts in its systems engineering processes. 
Functional Decomposition Process: The systems engineering manager chose the IDEF0 process for functional decomposition. A "war room" was set up where specialists representing service operations, training, logistics, maintenance, transportation, etc., could develop the

10 IDEF0 3.3.2.9 Tunneled Arrows, p 29-30. Tunneling is a methodology device for skipping across levels of the functional decomposition. 11 IDEF0 C.6, p113, The IDEF Model Walk-Through Procedure, Step 3.


services functional architecture. The specialists had little or no training in systems engineering methodologies and no previous IDEF0 experience. There was no training for the IDEF0 methodology because of cost and schedule constraints. The team started creating the IDEF0 diagrams using the Microsoft Visio IDEF0 shapes. The Visio drawings were printed on a plotter and taped to the walls for review, refinement, and decomposition. The program then brought in a systems engineering tool that enabled linking shapes on Visio diagrams with architecture objects in the tool. The goal was to migrate the existing IDEF0 drawings into the new integrated Visio environment. However, since there was no process enforcement on the original Visio diagrams, many had to be redrawn to be compatible with the new environment. The team continued the functional decomposition with whichever members were available12. There were many emotion-charged discussions about the data flows between the activities, what to name them, and what information each flow represented. Upon resolution to the satisfaction of those present, the flow name was applied. Unfortunately, when a member who was not present when a data flow name was discussed reviewed the diagrams, there was frequently significant confusion about what the flow represented and / or a different perspective was introduced by the discussion. There were two ways ambiguity and inconsistency crept into the analysis: 1) different flows with the same name and 2) different names for the same information flow content. The further the system was decomposed, the more resources were consumed because of the lack of clarity of flow content. Because many of the functions being analyzed were for the service activities, nearly all activities required operating procedures of some sort – the same generic name was used for many different sets of procedures. 
The analysts erred by relying on the activity diagram context – the names of the activities using the flow – to imply which specific set of operating procedures was meant. Because there was no process for cataloging the meaning of flow names, the same flow content occasionally was given a different name at different levels in the decomposition. At this point, a consultant suggested that a data dictionary would help alleviate these problems by providing:
1. A specific name for each data definition.
2. A data dictionary sorted by data definition name that can be published by export to a word processor; the document contains each definition description and its content (either its sub flows or its primitive definition).
3. The capability of predefining data elements and composite data structures (bundles) as part of the allocation process.
4. The capability of defining multiple flows with a data definition when the flows carry the same data and automatically naming each data flow to match the data definition.
5. Support for hierarchical composite data structures.
6. Support for including the same data definition in multiple composite data structures.
7. Support for finding data flows without definitions.
8. The capability of creating data definitions before the need for them is identified by the functional decomposition, as determined by stakeholder needs.
9. The capability of automated data flow leveling between parent and child diagrams.

12 Some of the specialists were working other projects and were from other parts of the country – thus there were both travel and availability constraints.

10. The capability of generating reports to identify inconsistencies and generate metrics.
11. Support for discovering needed functions to process required data.
12. Support for discovering needed functions or sources of data required by functions in the system.
13. Support for discovering external sources of data required by functions in the system.
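Several of the capabilities above (a specific name per definition, composite content, and finding flows without definitions) can be sketched in a few lines. The class design and the entry names below are illustrative assumptions, not taken from any tool used on the program:

```python
# A minimal data dictionary supporting composite (bundled) and primitive
# definitions. A composite entry lists its constituent definitions; a
# primitive entry has no content and carries only its description.

class DataDictionary:
    def __init__(self):
        # name -> {"description": str, "content": list of names or None}
        self.entries = {}

    def define(self, name, description, content=None):
        """Register a definition; content names constituents (composite) or is None (primitive)."""
        self.entries[name] = {"description": description, "content": content}

    def undefined_references(self):
        """Names referenced inside composites but never defined themselves."""
        referenced = {c for e in self.entries.values() if e["content"]
                      for c in e["content"]}
        return sorted(referenced - set(self.entries))

dd = DataDictionary()
dd.define("Maintenance Request",
          "Composite flow sent by an operator to schedule service",
          content=["Product ID", "Fault Description"])
dd.define("Product ID", "Unique identifier assigned at manufacture")

# "Fault Description" is used in a composite but never defined -- exactly
# the kind of gap capability 7 above is meant to surface.
print(dd.undefined_references())  # ['Fault Description']
```

A real tool would add reverse links from definitions to the flows they define (capability 4) and reporting (capability 10), but even this skeleton catches the "same flow content, different names" problem by forcing every name through one registry.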

The systems engineering tool was tailored to support the data dictionary and provide tools to create data definitions from the existing data flow names as a one-time process. If a data flow name occurred more than once, it was automatically associated with the previously created data definition. The data flow name was linked to the data definition so that if the data definition was renamed, the data flow name would automatically update. The tool logged each data flow processed and the action taken. If there was an inconsistency or error, the error was described in the log and no change was made, leaving the corrective action for users to determine and apply manually. Because of the many inconsistencies in the data that could not be automatically reconciled, the question was raised of how long it would take to review the inconsistencies and correct them. The estimate of about a week, while modest considering the large number of hours already invested, was deemed by management to be too large an impact on both schedule and budget. The direction was to continue with the existing process and correct the situation later, thereby continuing to introduce more inconsistencies. As might be expected, later has not yet arrived and the program is on hold.

Case 2: Company Y Applies IDEF0 to Improve SE Process

Company Y had a contract to provide a major "technology insertion" subsystem replacement as part of a block upgrade in a military aircraft. The subsystem comprised a number of integrated lower level subsystems (hereafter called components to reduce confusion). The company was highly experienced in the technology associated with the subsystem and had already issued component specifications for many of the subsystem components based on its prior technology experience. Background: The requirements development process used by Company Y was the traditional text-based process using older DoD specification practices and a requirements tracing tool. The prime's subsystem specification was text-based in a requirements tracing tool. However, there were major gaps between the prime's subsystem specification and the requirements allocated to the subsystem components. The airframe prime was very dissatisfied with the progress achieved during requirements development and with the correction of requirement deficiencies for the subsystem. The prime notified the subcontractor that the subcontract was in jeopardy – the prime was considering revoking the contract if the state of the requirements was not drastically improved before the system design review scheduled for later in the year. Functional Decomposition Process: The systems engineering manager was looking for a quick alternative process to recover from the program's difficulties by the system design review date. The IDEF0 methodology was selected and Visio was the selected diagramming tool. Some engineers on the project were already familiar with Visio as a tool – many of the existing specification documents contained detailed graphics (engineering drawings) created and maintained in Visio. The systems engineering manager chose the Visio-based IDEF0 process for functional decomposition.


A context diagram for the subsystem was created along with a first level decomposition into activities that represented each of the subsystem components. Several conference rooms were set up where teams of specialists for the subsystem components could project Visio IDEF0 drawings and interactively develop the activity diagrams, thus developing the component functional decompositions for many components in parallel. The specialists had little or no training in systems engineering functional decomposition methodologies and little or no previous IDEF0 experience. There was no training for the IDEF0 methodology because of cost and schedule constraints. Then a systems engineering tool was brought on board to integrate the requirements management, subsystem requirements verification, and architecture development process, including interfaces between the Visio diagrams and the systems engineering tool. The value of a data dictionary was recognized and one was to be provided to help manage the functional decomposition process. Little training was provided for the IDEF0 methodology or the engineering tool – the systems engineers had to learn on the job. Many of the IDEF0 Visio drawings already created were incompatible with the SE tool's Visio interface and had to be redrawn in the SE tool Visio environment. There was much frustration with the SE tool Visio interface because of the lack of training in the IDEF0 methodology and the lack of understanding of the SE tool capabilities. The on-site SE tool support personnel spent most of their time on training (disguised as mentoring) instead of helping focus users on identifying SE processes and adapting the tools to support those processes. However, using the SE tool Visio environment, the project did succeed in eliminating the requirement gaps and allocated subsystem requirements to the IDEF0 functional architecture and to data flows defined by data definitions. 
Inconsistencies were identified (e.g., unallocated requirements, functions with no allocated requirements), data flows were traceable from supplier to consumer activities through the subsystem, etc. By system design review time, the system functional decomposition was completed and documented, requirements metrics were provided, subsystem documentation was generated from the SE database, supporting material for the design review was delivered on schedule, and the prime contractor was satisfied with the design review.

Case 3: Mars Climate Orbiter – Interface Units Mismatch

While it is unknown to the author whether IDEF0 or another methodology was employed by the Mars Climate Orbiter (MCO) program, the failure to implement or effectively apply a robust data definition methodology clearly illustrates the consequences. Extracts of press releases describe the factors leading to the loss of the Mars Climate Orbiter. “The Mars Climate Orbiter was lost at the Red Planet on September 23, 1999 because the mission's navigation team was unfamiliar with the spacecraft (Clark 1999). It lacked training, and failed to detect a mistake by outside engineers who delivered navigation information in English rather than metric units, according to a mission failure investigation report. “A litany of errors and problems led to the loss of the $125 million spacecraft. The root cause of the failure was the units mix-up between navigation teams, the real problem was a systemic failure to follow NASA procedures, said Ed Weiler, NASA's Associate Administrator for Space Science.”

“The spacecraft reached Mars (NASA 1998) and executed a 16 minute 23 second orbit insertion main engine burn on 23 September 1999 at 09:01 UT (5:01 a.m. EDT) Earth received time (ERT, signal travel time from Mars will be 10 minutes 55 seconds). The spacecraft passed behind Mars at 09:06 UT ERT and was to re-emerge and establish radio contact with Earth at 09:27 UT ERT, 10 minutes after the burn was completed. However, contact was never re-established and no signal was ever received from the spacecraft. Findings of the failure review board indicate that a navigation error resulted from some spacecraft commands being sent in English units instead of being converted to metric. This caused the spacecraft to miss its intended 140 - 150 km altitude above Mars during orbit insertion, instead entering the Martian atmosphere at about 57 km. The spacecraft would have been destroyed by atmospheric stresses and friction at this low altitude.”

Figure 1. Mars Climate Orbiter lost because of mismatch in measurement units.

Attributing the failure to a “navigation error” is a bit of a euphemism. An abstraction from the MCO Mishap Investigation Board (MIB) report provides more insight on the MCO loss (NASA 1999). “The MCO MIB has determined that the root cause for the loss of the MCO spacecraft was the failure to use metric units in the coding of the ground file, “Small Forces”, used in the trajectory models. Specifically, thruster performance data in English units instead of metric units was used in the software application code titled SM_FORCES (small forces). A file called Angular Momentum Desaturation (AMD) contained the output data from the SM_FORCES software. The data in the AMD file was required to be in metric units per existing software interface documentation, and the trajectory modelers assumed the data was provided in metric units per the requirements. 
“During the 9-month journey from Earth to Mars, propulsion maneuvers were periodically performed to remove angular momentum buildup in the on-board reaction wheels (flywheels). These AMD events occurred 10-14 times more often than was expected by the operations navigation team. This was because the MCO solar array was asymmetrical relative to the spacecraft body as compared to the Mars Global Surveyor (MGS) which had symmetric solar arrays. This asymmetric effect significantly increased the Sun-induced (solar pressure-induced) momentum buildup on the spacecraft. The increased AMD events coupled with the fact that the angular momentum (impulse) data was in English, rather than metric, units, resulted in small errors being introduced in the trajectory estimate over the course of the 9-month journey. At the time of Mars insertion, the spacecraft trajectory was approximately 170 kilometers lower than planned. As a result, MCO either was destroyed in the atmosphere or re-entered heliocentric space after leaving Mars’ atmosphere.”
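The failure mode described above can be made concrete with a small sketch: impulse values tagged with an explicit unit can be converted at the interface, while untagged values invite silent misinterpretation. The conversion factor is the standard 1 lbf·s ≈ 4.448 N·s; the numeric values and function name are illustrative only, not mission data:

```python
# Illustration of the MCO failure mode: impulse data produced in
# pound-force seconds but consumed as newton seconds. Requiring a unit
# tag at the interface makes the mismatch detectable instead of letting
# a ~4.45x error accumulate silently over months of small maneuvers.

LBF_S_TO_N_S = 4.448222  # 1 pound-force second in newton seconds

def to_newton_seconds(value, unit):
    """Convert a tagged impulse value to newton seconds, or fail loudly."""
    if unit == "N*s":
        return value
    if unit == "lbf*s":
        return value * LBF_S_TO_N_S
    raise ValueError(f"unrecognized impulse unit: {unit}")

# An untagged hand-off reading 1.0 lbf*s as 1.0 N*s understates the
# thruster effect by a factor of ~4.45 per event.
print(round(to_newton_seconds(1.0, "lbf*s"), 3))  # 4.448
```

This is precisely the kind of primitive data definition a data dictionary would carry: the AMD file's impulse field, unambiguously defined with its unit, checked at the producer/consumer interface.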


Lessons Learned

The lessons to be learned address the issues in the systems engineering processes described in the cases above. The situations the engineering teams found themselves in occur frequently and are usually the result of circumstances beyond the control of the (usually) dedicated and skilled engineers thrust into a tight cost and schedule environment. In cases 1 and 2, the engineering staffs were expected to produce results with an unfamiliar methodology and unfamiliar tools, with only on-the-job training. In case 3, the team that provided the data in incorrect units was world class, but work schedule issues were cited as a major factor in the error. A key to successful systems engineering is creating work products that the consumers of the products can understand and interpret correctly, i.e., work products that are clear, complete, and useful for creating, maintaining, and operating a product. Here are two statements about the critical importance of human understandability for system success: "Humans define, create, and use systems13. Therefore, to minimize opportunities for problems in a system’s lifecycle, the work products that affect the system definition and program lifecycle activities must be highly understandable and not subject to multiple interpretations" (DSmith 2004). "Ultimately, Structured Analysis is a mental discipline, aided and supported by an easy to learn but challenging to master SADT methodology, for developing and applying well-structured, reliable, and carefully reasoned understanding -- in a team enterprise setting. It applies to any subject matter of interest" (DRoss 1994).

Lesson 1

Focusing on one analysis view is likely to lead to an incomplete system definition. While not explicitly stated in the first two cases, the rationale for the methodology selected is that "…IDEF0 (is) used to produce a 'function model'"14. "One of the most important features of IDEF0 is that it gradually introduces greater and greater levels of detail through the diagram structure comprising the model. In this way, communication is enhanced by providing the reader with a well-bounded topic with a manageable amount of new information to learn from each diagram.15” While this is all good, it is rather myopic. The technique focuses on functions and, without training in the methodology, leads users to treat the data flows (interfaces) as second class citizens in architecture development. This view tends to introduce errors of omission of data that is needed in the system. The data should be analyzed and decomposed with the same rigor as the system functions. This is where an integrated data dictionary is very useful16.

13 Humans may also be elements of and perform functions in a system. 14 IDEF0, Background, p 9. 15 IDEF0, A.3.2 Communication by Gradual Exposition of Detail, p 49. 16 While a data dictionary is not defined in IDEF0 or UML, using a data dictionary does not violate either methodology.

A data dictionary provides another important capability, a methodical way to identify and capture data definitions that the system must process, thus helping identify functionality that is required of the system, perhaps before any diagrams are created. See the list of other data dictionary uses in Case 1.

Figure 2 illustrates part of an example data dictionary for a logical system model (functional model) where the data definitions define data flows (functional interfaces) between functions in the system. Data definitions fill the role of requirements for interfaces. The logical data in this example is fairly abstract and omits the detail required for physical architecture interfaces. At lower levels in the system, the data definitions become much finer grained. It is useful to create a separate physical data dictionary containing physical interface definitions. The functional interfaces (logical data definitions) also fill the role of high level requirements for the physical system interface definitions.

Figure 2. An example of a portion of a data dictionary with an expanded composite entry and its attributes, Description and Content.

Figure 3 is a notional view of the feedback paths of a truly gradual (incremental) functional decomposition process. Iteratively performing all of the activities in the figure greatly reduces the probability of errors of omission during the system definition process. See (DSmith 2004) for additional details and an overview of how to discover system interfaces needed to assure meeting stakeholder needs.


Figure 3. The IDEF0 methodology identifies functions and arrows between functions and may incidentally capture functional analysis rationale.

Lesson 2

Lack of emphasis on data flow leveling between parent and child diagrams in the IDEF0 standard tends to result in this key activity being overlooked or improperly performed. This is an area for introducing inconsistencies into the system function definition. On complex systems with thousands of data definitions, some method of automatic checking is essential to assure that the proper flows or subsets of composite flows are correctly propagated. One serious disadvantage of creating any functional diagram with a generic drawing package is complete reliance on the staff that creates and reviews the drawings. Staff members must know the methodology and carefully perform iterative reviews. Simple changes, like renaming a flow that is pervasive in the system (e.g., a time stamp or current time), can be difficult to make correctly on every diagram – another source of inconsistencies in the system definition. Another time consuming and difficult task is assessing the effects of a proposed change. Reference links between definitions and data flows (interfaces) simplify the change assessment process. With an integrated data dictionary, changing the name of a data definition automatically propagates the change to all occurrences of the flows defined by the definition.
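The reference-linking described above can be sketched as flows that hold a reference to a shared data definition rather than their own copy of the name string, so one rename is visible on every diagram. The class and activity names are hypothetical:

```python
# Rename propagation via shared references: each flow points at a single
# DataDefinition object, so renaming the definition updates the label of
# every flow it defines, on every diagram, in one place.

class DataDefinition:
    def __init__(self, name):
        self.name = name

class Flow:
    def __init__(self, definition, source, sink):
        self.definition = definition        # shared reference, not a copied string
        self.source, self.sink = source, sink

    @property
    def label(self):
        # The displayed label is always read through the definition.
        return self.definition.name

timestamp = DataDefinition("Time Stamp")
flows = [Flow(timestamp, "A1", "A2"), Flow(timestamp, "A3", "A4")]

timestamp.name = "Current Time"         # one change...
print([f.label for f in flows])         # ...seen by every flow
# ['Current Time', 'Current Time']
```

With labels copied into each diagram instead, the same rename would require finding and editing every occurrence by hand, which is exactly where the inconsistencies described above creep in.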


Lesson 3

Lack of a data store concept in IDEF0 may result in users failing to recognize the need for, or availability of, persistent data that may be globally accessible by many functions. Lack of a capability for directly identifying data storage in a system may bias analysts' perceptions away from a more optimal functional decomposition for the system. If a work-around method of storing or retrieving data is employed, it may not be properly understood by the stakeholders, resulting in the system not being implemented as intended. With the prevalence of distributed information in the form of databases, web sites, and persistent storage, a method of clearly identifying and accessing persistent data is needed. The DoDAF workaround of defining a DATA-STORE as a type of PROCESS-ACTIVITY in the CADM is an awkward implementation at best. IDEF0 activities have no intrinsic methodology for behaving as data-like objects and are likely to be misinterpreted by users.

Lesson 4
To minimize the risk of project failure, introducing new process methodologies well into a program requires careful planning, and staff training needs should be a major consideration. It is a well-recognized axiom that the time to introduce new methodologies and tools is at the beginning of a program, and that doing so in the middle of a program is a recipe for disaster. In both IDEF0 cases described, however, systems engineering management could see that the programs would probably fail if drastic process changes were not implemented, and better processes were the perceived best hope for the programs to survive. Even so, in both cases training was bypassed on both the IDEF0 methodology and the tools used to implement it and to integrate the IDEF0 diagram objects with the systems engineering tools. To the extent that the SE tool and Visio integration enforced certain methodology constraints, the analysts were forced, by trial and error, to follow some IDEF0 processes. The capability of the SE tool to flag inconsistencies missed by visual inspection helped considerably.

Lesson 5
Employ a graphical analysis environment that is integrated with the systems engineering database tools. A seamless interface between graphical analysis tools and the primary systems engineering database improves productivity and eliminates inconsistency between work products. For example, documents generated by the SE tool(s) that contain figures generated by the graphical tool(s) are automatically consistent, as opposed to figures imported from an external environment.


Lesson 6
The consequences of errors introduced by improperly applying a selected process, or by using an incomplete process, are usually unexpected and expensive. In the case of the Mars Climate Orbiter, the publicly stated cost was $125 million. Fortunately, there was no loss of human life or environmental catastrophe; however, the failed mission was a lost opportunity that affected the overall success of the Mars exploration program. For the two IDEF0 cases, there is inadequate accounting data to assess the productivity lost due to the lack of training. Staff members worked long hours of unpaid overtime attempting to make the programs succeed. How much more was lost replacing staff who abandoned the projects, or the company, because of the working conditions?
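The Mars Climate Orbiter loss traced to an interface mismatch: thruster impulse data was delivered in pound-force seconds where the interface specification called for newton-seconds. A minimal sketch of unit-tagged interface values shows how mechanized interface checking can catch this class of error (the `Quantity` type and `expect` guard are hypothetical illustrations, not any flight-software API):

```python
"""Sketch: unit-tagged interface values that reject a units mismatch at the
interface boundary instead of silently corrupting downstream calculations."""
from dataclasses import dataclass

LBF_S_TO_N_S = 4.44822  # one pound-force second expressed in newton-seconds

@dataclass(frozen=True)
class Quantity:
    value: float
    unit: str

def expect(q, unit):
    """Interface guard: accept a quantity only if its unit matches the
    interface specification; otherwise fail loudly."""
    if q.unit != unit:
        raise ValueError(f"interface expected {unit}, received {q.unit}")
    return q.value
```

The point is not the three lines of code but the process: interfaces defined once, in a shared dictionary, with units that tools can check, rather than assumptions carried silently between teams.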

Summary
An overview was provided of the perceived impetus to employ IDEF0 functional decomposition for defining systems. Although DoDAF uses IDEF0 diagrams to illustrate several views, it is careful not to preclude alternative methodologies. Two case histories were provided for programs that recently chose to apply the IDEF0 methodology, one perceived advantage being a step toward DoDAF compliance. The third case is related in that it illustrates the potential consequences of interface errors, or errors of omission, arising from the process methodology. Several lessons learned were summarized based on experiences on these programs. While all of the lessons learned were issues in both cases that used IDEF0, they are generally applicable to many other methodologies, particularly the usefulness of an integrated, comprehensive data dictionary and an overall systems engineering tool set that puts as much emphasis on the system interfaces as on the system architectures.

References
DeMarco 1978, Structured Analysis and System Specification, T. DeMarco, 1978, Yourdon Press, ISBN 0-917072-07-3.
Clark 1999, Navigation Team Was Unfamiliar with Mars Climate Orbiter, http://www.space.com/news/mco_report-b_991110.html, Nov 11, 1999.
DoDAF VII-2003, DoD Architecture Framework Version 1.0, Volume II: Product Descriptions, August 2003.
DoD8320 1998, Data Standardization Procedures, DoD 8320-1-M1, Defense Technical Information Center (DTIC), Fort Belvoir, VA.
DRoss 1988, Foreword, in (Marca 1988), p. xiv.
DRoss 1994, Must Technology's Future Be a Product of its Past?, http://www.swiss.au.mit.edu/projects/studentaut/Background_introduction.html.
DSmith 2004, Human Understanding for Optimizing Architectures, D. K. Smith, INCOSE Symposium Proceedings, 2004.
FIPS-183 1993, Integration Definition for Function Modeling (IDEF0), National Institute of Standards and Technology (NIST), Publication 183, 1993.
FIPS-184 1993, Integration Definition for Information Modeling (IDEF1X), National Institute of Standards and Technology (NIST), Publication 184, 1993.

Marca 1988, Structured Analysis and Design Technique, McGraw-Hill, Inc., 1988.
NASA 1998, Mars Climate Orbiter, http://nssdc.gsfc.nasa.gov/nmc/tmp/1998-073A.html.
NASA 1999, Mars Climate Orbiter Mishap Investigation Board Phase I Report – Executive Summary, ftp://ftp.hq.nasa.gov/pub/pao/reports/1999/MCO_report.pdf, November 10, 1999.

BIOGRAPHY
Mr. Smith is a registered Professional Engineer in Texas in two areas, including Electrical Engineering, and is an INCOSE CSEP. With an extensive background in Systems Engineering, he assists projects with establishing and tailoring systems engineering processes that encompass Product Lifecycle Management (PLM). Previous positions include: Raytheon Systems Company (Sr. SE), Texas Instruments Defense Systems Group (Member, Group Technical Staff, SE), Motorola Government Electronics Group (Principal Staff EE), and General Dynamics Electronics (Sr. Design Engineer). Disciplines include: Systems Engineering, Software Engineering, CASE tool development, Knowledge Engineering, Rapid Prototyping, AI applications, Support Engineering, electronic equipment design, and aerospace electrical systems design.
