Federal Information Processing Standards Publication 184 (1993)

Total Pages: 16

File Type: PDF, Size: 1020 KB

Federal Information Processing Standards Publication 184
1993 December 21

Announcing the Standard for INTEGRATION DEFINITION FOR INFORMATION MODELING (IDEF1X)

Federal Information Processing Standards Publications (FIPS PUBS) are issued by the National Institute of Standards and Technology after approval by the Secretary of Commerce pursuant to Section 111(d) of the Federal Property and Administrative Services Act of 1949 as amended by the Computer Security Act of 1987, Public Law 100-235.

1. Name of Standard. Integration Definition for Information Modeling (IDEF1X).

2. Category of Standard. Software Standard, Modeling Techniques.

3. Explanation. This publication announces the adoption of the Integration Definition for Information Modeling (IDEF1X) as a Federal Information Processing Standard (FIPS). This standard is based on the Integrated Information Support System (IISS), Volume V - Common Data Model Subsystem, Part 4 - Information Modeling Manual - IDEF1 Extended (IDEF1X), November 1985. This standard describes the IDEF1X modeling language (semantics and syntax) and the associated rules and techniques for developing a logical model of data. IDEF1X is used to produce a graphical information model which represents the structure and semantics of information within an environment or system. Use of this standard permits the construction of semantic data models which may serve to support the management of data as a resource, the integration of information systems, and the building of computer databases. This standard is the reference authority for information modelers required to utilize the IDEF1X modeling technique, for implementors developing tools that support the technique, and for other computer professionals who need the precise syntactic and semantic rules of the standard.

4. Approving Authority. Secretary of Commerce.

5. Maintenance Agency. Department of Commerce, National Institute of Standards and Technology, Computer Systems Laboratory.

6. Cross Index. a. Integrated Information Support System (IISS), Volume V - Common Data Model Subsystem, Part 4 - Information Modeling Manual - IDEF1 Extended.

7. Related Documents. a. Federal Information Resources Management Regulations Subpart 201.20.303, Standards, and Subpart 201.39.1002, Federal Standards. b. ICAM Architecture Part II - Volume V - Information Modeling Manual (IDEF1), AFWAL-TR-81-4023, Materials Laboratory, Air Force Wright Aeronautical Laboratories, Air Force Systems Command, Wright-Patterson Air Force Base, Ohio 45433, June 1981. c. ICAM Architecture Part II - Volume IV - Function Modeling Manual (IDEF0), AFWAL-TR-81-4023, Materials Laboratory, Air Force Wright Aeronautical Laboratories, Air Force Systems Command, Wright-Patterson Air Force Base, Ohio 45433, June 1981. d. ICAM Configuration Management, Volume II - ICAM Documentation Standards for Systems Development Methodology (SDM), AFWAL-TR-82-4157, Air Force Systems Command, Wright-Patterson Air Force Base, Ohio 45433, October 1983.

8. Objectives. The primary objectives of this standard are: a. To provide a means for completely understanding and analyzing an organization's data resources; b. To provide a common means of representing and communicating the complexity of data; c. To provide a technique for presenting an overall view of the data required to run an enterprise; d. To provide a means for defining an application-independent view of data which can be validated by users and transformed into a physical database design; e. To provide a technique for deriving an integrated data definition from existing data resources.

9. Applicability. An information modeling technique is used to model data in a standard, consistent, predictable manner in order to manage it as a resource. The use of this standard is strongly recommended for all projects requiring a standard means of defining and analyzing the data resources within an organization. Such projects include: a. incorporating a data modeling technique into a methodology; b. using a data modeling technique to manage data as a resource; c. using a data modeling technique for the integration of information systems; d. using a data modeling technique for designing computer databases. The specifications of this standard are applicable when a data modeling technique is applied to the following: a. projects requiring IDEF1X as the modeling technique; b. development of automated software tools implementing the IDEF1X modeling technique. The specifications of this standard are not applicable to projects requiring a data modeling technique other than IDEF1X. Nonstandard features of the IDEF1X technique should be used only when the needed operation or function cannot reasonably be implemented with the standard features alone. Although nonstandard features can be very useful, it should be recognized that the use of these or any other nonstandard elements may make the integration of data models more difficult and costly.

10. Specifications. This standard adopts the Integration Definition for Information Modeling (IDEF1X) as a Federal Information Processing Standard (FIPS).

11. Implementation. The implementation of this standard involves two areas of consideration: acquisition of implementations and interpretation of the standard.

11.1 Acquisition of IDEF1X Implementations. This publication (FIPS 184) is effective June 30, 1994. Projects utilizing the IDEF1X data modeling technique, or software implementing the IDEF1X data modeling technique, acquired for Federal use after this date should conform to FIPS 184. Conformance to this standard should be considered whether the project utilizing the IDEF1X data modeling technique is acquired as part of an ADP system procurement, acquired by separate procurement, used under an ADP leasing arrangement, or specified for use in contracts for programming services. A transition period provides time for industry to develop products conforming to this standard. The transition period begins on the effective date and continues for one (1) year thereafter. The provisions of this publication apply to orders placed after the date of this publication; however, utilizing an IDEF1X information modeling technique that does not conform to this standard may be permitted during the transition period.

11.2 Interpretation of this FIPS. NIST provides for the resolution of questions regarding the implementation and applicability of this FIPS. All questions concerning the interpretation of IDEF1X should be addressed to: Director, Computer Systems Laboratory, ATTN: FIPS IDEF1X Interpretation, National Institute of Standards and Technology, Gaithersburg, MD 20899.

12. Waivers. Under certain exceptional circumstances, the heads of Federal departments and agencies may approve waivers to Federal Information Processing Standards (FIPS).
The head of such agencies may redelegate such authority only to a senior official designated pursuant to section 3506(b) of Title 44, United States Code. Requests for waivers shall be granted only when: a. Compliance with a standard would adversely affect the accomplishment of the mission of an operator of a Federal computer system, or b. Compliance with a standard would cause a major adverse financial impact on the operator which is not offset by government-wide savings. Agency heads may approve requests for waivers only by a written decision which explains the basis upon which the agency head made the required finding(s). A copy of each such decision, with procurement sensitive or classified portions clearly identified, shall be sent to: Director, Computer Systems Laboratory, ATTN: FIPS Waiver Decisions, Technology Building, Room B-154, National Institute of Standards and Technology, Gaithersburg, MD 20899. In addition, notice of each waiver granted and each delegation of authority to approve waivers shall be sent promptly to the Committee on Government Operations of the House of Representatives and the Committee on Government Affairs of the Senate and shall be published promptly in the Federal Register. When the determination on a waiver request applies to the procurement of equipment and/or services, a notice of the waiver determination must be published in the Commerce Business Daily as a part of the notice of solicitation for offers of an acquisition or, if the waiver determination is made after that notice is published, by amendment of such notice. A copy of the waiver request, any supporting documents, the document approving the waiver request and any supporting and accompanying documents, with such deletions as the agency is authorized and decides to make under 5 U.S.C. Sec. 552(b), shall be part of the procurement documentation and retained by the agency.

13. Where to Obtain Copies. Copies of this publication are for sale by the National Technical Information Service, U.S. Department of Commerce, Springfield, VA 22161. When ordering, refer to Federal Information Processing Standards Publication 184 (FIPS PUB 184) and title. Payment may be made by check, money order, or deposit account.

Background: The need for semantic data models was first recognized by the U.S. Air Force in the mid-seventies as a result of the Integrated Computer Aided Manufacturing (ICAM) Program. The objective of this program was to increase manufacturing productivity through the systematic application of computer technology. The ICAM Program identified a need for better analysis and communication techniques for people involved in improving manufacturing productivity. As a result, the ICAM Program developed a series
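IDEF1X itself is a graphical notation, but a rough feel for what one of its logical models carries can be given in code. The sketch below is an illustrative, non-normative example and is not part of FIPS 184: a hypothetical two-entity model (DEPARTMENT and EMPLOYEE) with an identifying relationship, rendered as the kind of physical database design the objectives above name as a transformation target. All entity, attribute, and table names are invented for the example.

```python
# Illustrative only: a tiny IDEF1X-style logical model rendered as a physical
# SQLite schema. Entity and attribute names are hypothetical examples.
import sqlite3

DDL = """
-- Independent entity: identified by its own key.
CREATE TABLE department (
    department_id   INTEGER NOT NULL,        -- primary key attribute
    department_name TEXT NOT NULL,
    PRIMARY KEY (department_id)
);

-- Dependent entity in an identifying relationship: the parent's key
-- migrates into the child and becomes part of the child's primary key.
CREATE TABLE employee (
    department_id INTEGER NOT NULL,          -- migrated foreign key
    employee_no   INTEGER NOT NULL,
    employee_name TEXT NOT NULL,
    PRIMARY KEY (department_id, employee_no),
    FOREIGN KEY (department_id) REFERENCES department (department_id)
);
"""

if __name__ == "__main__":
    con = sqlite3.connect(":memory:")
    con.execute("PRAGMA foreign_keys = ON")   # enforce the relationship
    con.executescript(DDL)
    con.execute("INSERT INTO department VALUES (10, 'Engineering')")
    con.execute("INSERT INTO employee VALUES (10, 1, 'A. Modeler')")
    print(con.execute("SELECT * FROM employee").fetchall())
```

Running the script builds the schema in memory, enforces the migrated key, and prints the inserted employee row; the same logical model could of course be mapped to any relational DBMS.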
Recommended publications
  • Data Warehouse: an Integrated Decision Support Database Whose Content Is Derived from the Various Operational Databases
    DATABASE MANAGEMENT. Topic Objective: At the end of this topic the student will be able to: understand the contrasting basic concepts; understand the database server and database specified; understand the USER clause.
    Definition/Overview:
    Data: Stored representations of objects and events that have meaning and importance in the user's environment.
    Information: Data that have been processed in such a way that they can increase the knowledge of the person who uses them.
    Metadata: Data that describes the properties or characteristics of end-user data and the context of that data.
    Database application: An application program (or set of related programs) that is used to perform a series of database activities (create, read, update, and delete) on behalf of database users.
    Data warehouse: An integrated decision support database whose content is derived from the various operational databases.
    Constraint: A rule that cannot be violated by database users.
    Database: An organized collection of logically related data.
    Entity: A person, place, object, event, or concept in the user environment about which the organization wishes to maintain data.
    Database management system: A software system that is used to create, maintain, and provide controlled access to user databases.
    Data dependence; data independence: With data dependence, data descriptions are included with the application programs that use the data, while with data independence the data descriptions are separated from the application programs.
    Data warehouse; data mining: A data warehouse is an integrated decision support database, while data mining (described in the topic introduction) is the process of extracting useful information from databases.
    (An illustrative code sketch of several of these definitions follows this entry.)
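As a purely illustrative companion to the glossary above (not taken from the publication it excerpts), the short Python/SQLite sketch below shows several of the defined terms in miniature: the CREATE TABLE statement is metadata, the inserted rows are data, and the NOT NULL/CHECK clauses are constraints the database management system will not let users violate. The product table and its columns are invented for the example.

```python
# Illustrative sketch of data vs. metadata vs. constraint, using SQLite.
import sqlite3

con = sqlite3.connect(":memory:")
# Metadata: a description of the stored data (the schema), including a constraint.
con.execute("""
    CREATE TABLE product (
        product_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL,
        price      REAL NOT NULL CHECK (price >= 0)   -- constraint
    )
""")
# Data: stored representations of objects in the users' environment.
con.execute("INSERT INTO product VALUES (1, 'Widget', 9.99)")

# The DBMS enforces the constraint on behalf of all database users.
try:
    con.execute("INSERT INTO product VALUES (2, 'Broken', -5.0)")
except sqlite3.IntegrityError as err:
    print("rejected by constraint:", err)

# Metadata can itself be queried (the data dictionary / catalog).
print(con.execute("SELECT name, sql FROM sqlite_master").fetchall())
```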
  • Hyper-Intellectual-IT Infrastructure (NEDO-IT-0016)
    Summary: In recent years, broadband networks and the Internet environment have become widely available. Using this infrastructure, we will develop a knowledge co-operative manufacturing support system which can use various kinds of simulators and databases. But such a knowledge co-operative manufacturing support system still has a lot of problems, such as data formats, legal problems, and software support.
  • Data Model Standards and Guidelines, Registration Policies and Procedures
    Data Model Standards and Guidelines, Registration Policies and Procedures, Version 3.2, 6/02/2017. Document Version Control:
    VERSION | DATE | AUTHOR | DESCRIPTION
    DRAFT | 03/28/07 | Venkatesh Kadadasu | Baseline draft document
    0.1 | 05/04/2007 | Venkatesh Kadadasu | Sections 1.1, 1.2, 1.3, 1.4 revised
    0.2 | 05/07/2007 | Venkatesh Kadadasu | Sections 1.4, 2.0, 2.2, 2.2.1, 3.1, 3.2, 3.2.1, 3.2.2 revised
    0.3 | 05/24/07 | Venkatesh Kadadasu | Incorporated feedback from Uli
    0.4 | 5/31/2007 | Venkatesh Kadadasu | Incorporated Steve's feedback: Section 1.5 Issues, change "Decide" to "Decision"; Section 2.2.5, coordinate with Kumar and Lisa to determine the class words used by the XML community and identify them in the document (discussed previously); Data Standardization, the cross-walk table between tabular naming standards and XML has been discussed on several occasions, when did it get dropped?; Section 2.3.2, conceptual data model level of detail changed from "(S) No foreign key attributes may be entered in the conceptual data model." to "(S) No attributes may be entered in the conceptual data model."
    0.5 | 6/4/2007 | Steve Horn | Move last paragraph of Section 2.0 to Section 2.1.4 Data Standardization; added definitions of key terms
    0.6 | 6/5/2007 | Ulrike Nasshan | Section 2.2.5, coordinate with Kumar and Lisa to determine the class words used by the XML community and identify them in the document.
  • Best Practices in Business Instruction. INSTITUTION Delta Pi Epsilon Society, Little Rock, AR
    DOCUMENT RESUME ED 477 251 CE 085 038 AUTHOR Briggs, Dianna, Ed. TITLE Best Practices in Business Instruction. INSTITUTION Delta Pi Epsilon Society, Little Rock, AR. PUB DATE 2001-00-00 NOTE 97p. AVAILABLE FROM Delta Pi Epsilon, P.O. Box 4340, Little Rock, AR 72214 ($15). Web site: http://www.dpe.org/ . PUB TYPE Collected Works General (020) Guides Classroom Teacher (052) EDRS PRICE EDRS Price MF01/PC04 Plus Postage. DESCRIPTORS Accounting; *Business Education; Career Education; *Classroom Techniques; Computer Literacy; Computer Uses in Education; *Educational Practices; *Educational Strategies; Group Instruction; Keyboarding (Data Entry); *Learning Activities; Postsecondary Education; Secondary Education; Skill Development; *Teaching Methods; Technology Education; Vocational Adjustment; Web Based Instruction IDENTIFIERS *Best Practices; Electronic Commerce; Intranets ABSTRACT This document is intended to give business teachers a few best practice ideas. Section 1 presents an overview of best practice and a chart detailing the instructional levels, curricular areas, and main competencies addressed in the 26 papers in Section 2. The titles and authors of the papers included in Section 2 are as follows: "A Software Tool to Generate Realistic Business Data for Teaching" (Catherine S. Chen); "Alternatives to Traditional Assessment of Student Learning" (Nancy Csapo); "Applying the Principles of Developmental Learning to Accounting Instruction" (Burt Kaliski); "Collaborative Teamwork in the Classroom" (Shelia Tucker); "Communicating Statistics Measures of Central Tendency" (Carol Blaszczynski); "Creating a Global Business Plan for Exporting" (Les Dlabay); "Creating a Supportive Learning Environment" (Rose Chinn); "Developing Job Survival Skills"(R. Neil Dortch); "Engaging Students in Personal Finance and Career Awareness Instruction: 'Welcome to the Real World!'" (Thomas Haynes); "Enticing Students to Prepare for and to Stay 'Engaged' during Class Presentations/Discussions" (Zane K.
  • Integration Definition for Function Modeling (IDEF0)
    NIST, U.S. Department of Commerce, Technology Administration, National Institute of Standards and Technology. FIPS PUB 183, Federal Information Processing Standards Publication: Integration Definition for Function Modeling (IDEF0). Category: Software Standard; Subcategory: Modeling Techniques. Computer Systems Laboratory, National Institute of Standards and Technology, Gaithersburg, MD 20899. Issued December 21, 1993. U.S. Department of Commerce, Ronald H. Brown, Secretary; Technology Administration, Mary L. Good, Under Secretary for Technology; National Institute of Standards and Technology, Arati Prabhakar, Director. Foreword: The Federal Information Processing Standards Publication Series of the National Institute of Standards and Technology (NIST) is the official publication relating to standards and guidelines adopted and promulgated under the provisions of Section 111(d) of the Federal Property and Administrative Services Act of 1949 as amended by the Computer Security Act of 1987, Public Law 100-235. These mandates have given the Secretary of Commerce and NIST important responsibilities for improving the utilization and management of computer and related telecommunications systems in the Federal Government. The NIST, through its Computer Systems Laboratory, provides leadership, technical guidance,
  • Chapter 2: Database System Concepts and Architecture
    Chapter 2: Database System Concepts and Architecture.
    Define: data model - a set of concepts that can be used to describe the structure of a database (data types, relationships, and constraints), together with a set of basic operations (retrievals and updates) that specify behavior (a set of valid user-defined operations).
    Categories:
    high-level (conceptual data model) - provides concepts close to the way a user perceives data: entity - a real-world object or concept to be represented in the db; attribute - some property of the entity; relationship - represents an interaction among entities.
    representational (implementation data model) - hides some details of how data is stored, but can be implemented directly; record-based models such as the relational model are representational.
    low-level (physical data model) - provides details of how data is stored: record formats, record orderings, access paths (for efficient search).
    Schemas and instances: database schema - a description of the data (meta-data), defined at design time; each object in the schema is a schema construct. EX: in the TOY example, the top notation represents the schema; schema constructs: cust ID, order #, etc. Database state - the data in the database at any particular time, also called the set of instances; an instance of data is filled in when the database is populated/updated. EX: cust name is a schema construct; George Grant is an instance of cust name. Difference between schema and state: at design time the schema is defined and the state is the empty state; the state changes each time data is inserted or updated, while the schema remains the same (see the short sketch after this entry). Three-schema architecture
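The distinction between a schema (defined at design time) and a database state (the set of instances, which changes with every insert or update) can be seen in a few lines of code. The sketch below is a minimal illustration using SQLite and the customer example from the notes above; it is not taken from the cited chapter.

```python
# Illustrative sketch: schema vs. database state, using SQLite.
import sqlite3

con = sqlite3.connect(":memory:")

# Schema (meta-data), defined at design time; the database state is empty.
con.execute("CREATE TABLE customer (cust_id INTEGER PRIMARY KEY, cust_name TEXT)")
print(con.execute("SELECT COUNT(*) FROM customer").fetchone())   # (0,) -> empty state

# The state changes each time data is inserted or updated ...
con.execute("INSERT INTO customer VALUES (1, 'George Grant')")
print(con.execute("SELECT * FROM customer").fetchall())          # one instance

# ... while the schema (the schema constructs) remains the same.
print(con.execute("SELECT sql FROM sqlite_master WHERE name='customer'").fetchone())
```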
  • Fielder Elected
    The Leading and Most Widely Circulated Weekly Newspaper in Union County. Westfield, New Jersey, Wednesday, November 5, 1913. Fourteen pages, 2 cents. Headlines: the two Clarks are re-elected in Westfield with many votes to the good; Casey polls a big vote in the Fourth and is returned to the Council; the early returns showed Traynor running strong for Assessor, but Denman receives a majority of over two hundred votes; county Democratic Assembly candidates elected by about six hundred majority; Mitchell and the entire Fusion ticket win in New York; Mayor Evans was defeated; Fielder elected Governor of New Jersey.
  • Using Telelogic DOORS and Microsoft Visio to Model and Visualize Complex Business Processes
    Using Telelogic DOORS and Microsoft Visio to Model and Visualize Complex Business Processes, "The Business Driven Application Lifecycle". Bob Sherman, Procter & Gamble Pharmaceuticals, [email protected]; Michael Sutherland, Galactic Solutions Group, LLC, [email protected]. Prepared for the Telelogic 2005 User Group Conference, Americas & Asia/Pacific, http://www.telelogic.com/news/usergroup/us2005/index.cfm, 24 October 2005. Abstract: The fact that most Information Technology (IT) projects fail as a result of requirements management problems is common knowledge. What is not commonly recognized is that the widely hailed "use case" and Object Oriented Analysis and Design (OOAD) phenomenon has resulted in little (if any) abatement of IT project failures. In fact, ten years after the advent of these methods, every major IT industry research group remains aligned on the fact that these projects are still failing at an alarming rate (less than a 30% success rate). Ironically, the popularity of use case and OOAD (e.g. UML) methods may be doing more harm than good by diverting our attention away from addressing the real root cause of IT project failures (when you have a new hammer, everything looks like a nail). This paper asserts that the real root cause of IT project failures centers around the failure to map requirements to an accurate, precise, comprehensive, optimized business model. This argument will be supported by a brief recap of the history of use case and OOAD methods to identify differences between the problems these methods were intended to address and the challenges of today's IT projects.
  • Data Models for Home Services
    PROCEEDING OF THE 13TH CONFERENCE OF FRUCT ASSOCIATION. Data Models for Home Services. Vadym Kramar, Markku Korhonen, Yury Sergeev, Oulu University of Applied Sciences, School of Engineering, Raahe, Finland {vadym.kramar, markku.korhonen, yury.sergeev}@oamk.fi. Abstract: The ultimate penetration of communication technologies allowing web access has enriched the conception of smart homes with new paradigms of home services. Modern home services range far beyond such notions as Home Automation or Use of Internet. The services expose their ubiquitous nature by being integrated into smart environments and provisioned through a variety of end-user devices. Computational intelligence requires the use of knowledge technologies, and within a given domain a requirement such as compliance with modern web architecture is essential. This is where Semantic Web technologies excel. This work presents an overview of important terms, vocabularies, and data models that may be utilised in data and knowledge engineering with respect to home services. Index Terms: Context, Data engineering, Data models, Knowledge engineering, Semantic Web, Smart homes, Ubiquitous computing. I. INTRODUCTION: In recent years, the use of Semantic Web technologies to build a giant information space has shown certain benefits. The rapid development of Web 3.0 and the use of its principles in web applications is the best evidence of such benefits. Traditional database design is still, and will remain, widely used in web applications. One of the most important reasons for that is the vast number of databases developed over the years and used in a variety of applications, varying from simple web services to enterprise portals. According to Forrester Research, though, a growing number of document or knowledge bases, such as NoSQL stores, are no longer a hype [1].
  • Modelling, Analysis and Design of Computer Integrated Manufacturing Systems
    MODELLING, ANALYSIS AND DESIGN OF COMPUTER INTEGRATED MANUFACTURING SYSTEMS. Volume I of II. Abdulrahman Musllabab Abdullah Al-Ailmarj, October 1998. A thesis submitted for the degree of Doctor of Philosophy, Mechanical Engineering Department, The University of Sheffield. In the Name of Allah, Most Gracious, Most Merciful. ACKNOWLEDGEMENTS: I would like to express my appreciation and thanks to my supervisor Professor Keith Ridgway for devoting freely of his time to read, discuss, and guide this research, and for his assistance in selecting the research topic, obtaining special reference materials, and contacting industrial collaborators. His advice has been much appreciated and I am very grateful. I would like to thank Mr Bruce Lake at Brook Hansen Motors who has patiently answered my questions during the case study. Finally, I would like to thank my family for their constant understanding, support and patience. To my parents, my wife and my son. ABSTRACT: In the present climate of global competition, manufacturing organisations consider and seek strategies, means and tools to assist them to stay competitive. Computer Integrated Manufacturing (CIM) offers a number of potential opportunities for improving manufacturing systems. However, a number of researchers have reported the difficulties which arise during the analysis, design and implementation of CIM due to a lack of effective modelling methodologies and techniques and the complexity of the systems. The work reported in this thesis is related to the development of an integrated modelling method to support the analysis and design of advanced manufacturing systems. A survey of various modelling methods and techniques is carried out. The methods SSADM, IDEF0, IDEF1X, IDEF3, IDEF4, OOM, SADT, GRAI, PN, 10A MERISE, GIM and SIMULATION are reviewed.
  • Principles of the Concept-Oriented Data Model
    Principles of the Concept-Oriented Data Model. Alexandr Savinov, Institute of Mathematics and Computer Science, Academy of Sciences of Moldova, Academiei 5, MD-2028 Chisinau, Moldova; Fraunhofer Institute for Autonomous Intelligent Systems, Schloss Birlinghoven, 53754 Sankt Augustin, Germany; http://www.conceptoriented.com; [email protected]. In the paper a new approach to data representation and manipulation is described, which is called the concept-oriented data model (CODM). It is supposed that items represent data units, which are stored in concepts. A concept is a combination of superconcepts, which determine the concept's dimensionality or properties. An item is a combination of superitems taken by one from all the superconcepts. An item stores a combination of references to its superitems. The references implement the inclusion relation or attribute-value relation among items. A concept-oriented database is defined by its concept structure, called syntax or schema, and its item structure, called semantics. The model defines formal transformations of syntax and semantics, including the canonical semantics where all concepts are merged and the data semantics is represented by one set of items. The concept-oriented data model treats relations as subconcepts where items are instances of the relations. Multi-valued attributes are defined via subconcepts as a view on the database semantics rather than as a built-in mechanism. (A toy code sketch of these structures follows this entry.)
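The following toy sketch, written for this listing and not taken from Savinov's paper, mimics the structures the abstract describes: a concept defined as a combination of superconcepts, and an item that stores one reference to a superitem from each superconcept. The Customer/Product/Order names are invented for illustration.

```python
# Illustrative sketch (not the author's implementation): concepts with
# superconcepts, and items that store one superitem reference per superconcept.
from dataclasses import dataclass, field


@dataclass
class Concept:
    name: str
    superconcepts: list["Concept"] = field(default_factory=list)  # dimensions/properties


@dataclass
class Item:
    concept: Concept
    value: str
    # One superitem reference per superconcept, standing in for the
    # inclusion / attribute-value relation among items.
    superitems: dict[str, "Item"] = field(default_factory=dict)


# Hypothetical schema ("syntax"): Order is defined over Customer and Product.
customer = Concept("Customer")
product = Concept("Product")
order = Concept("Order", superconcepts=[customer, product])

# Hypothetical data ("semantics"): each order item references one superitem
# taken from each superconcept.
alice = Item(customer, "Alice")
widget = Item(product, "Widget")
o1 = Item(order, "order-001", superitems={"Customer": alice, "Product": widget})

print([(dim, it.value) for dim, it in o1.superitems.items()])
```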
  • Metamodeling the Enhanced Entity-Relationship Model
    Metamodeling the Enhanced Entity-Relationship Model. Robson N. Fidalgo1, Edson Alves1, Sergio España2, Jaelson Castro1, Oscar Pastor2. 1 Center for Informatics, Federal University of Pernambuco, Recife (PE), Brazil {rdnf, eas4, jbc}@cin.ufpe.br. 2 Centro de Investigación ProS, Universitat Politècnica de València, València, España {sergio.espana,opastor}@pros.upv.es. Abstract. A metamodel provides an abstract syntax to distinguish between valid and invalid models. That is, a metamodel is as useful for a modeling language as a grammar is for a programming language. In this context, although the Enhanced Entity-Relationship (EER) Model is the "de facto" standard modeling language for database conceptual design, to the best of our knowledge there are only two proposals of EER metamodels, and they do not provide full support for Chen's notation. Furthermore, neither a discussion about the engineering used for specifying these metamodels is presented nor a comparative analysis among them is made. With the aim of overcoming these drawbacks, we show a detailed and practical view of how to formalize the EER Model by means of a metamodel that (i) covers all elements of Chen's notation, (ii) defines the well-formedness rules needed for creating syntactically correct EER schemas, and (iii) can be used as a starting point to create Computer Aided Software Engineering (CASE) tools for EER modeling, interchange metadata among these tools, perform automatic SQL/DDL code generation, and/or extend (or reuse part of) the EER Model. In order to show the feasibility, expressiveness, and usefulness of our metamodel (named EERMM), we have developed a CASE tool (named EERCASE), which has been tested with a practical example that covers all EER constructors, confirming that our metamodel is feasible, useful, more expressive than related ones, and correctly defined. (A small illustrative sketch follows this list.)
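To make the opening claim of that abstract concrete, that a metamodel is an abstract syntax for telling valid models from invalid ones, the sketch below defines a tiny, hypothetical ER-style metamodel in Python with a couple of sample well-formedness rules. It is not the EERMM metamodel from the paper; the element names and rules are invented for illustration only.

```python
# Illustrative sketch only: a metamodel as an abstract syntax that separates
# valid from invalid models, shown with a tiny hypothetical ER-style fragment.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class EntityType:
    name: str
    attributes: List[str] = field(default_factory=list)
    key: Optional[str] = None


@dataclass
class RelationshipType:
    name: str
    participants: List[EntityType] = field(default_factory=list)


@dataclass
class Schema:
    entities: List[EntityType] = field(default_factory=list)
    relationships: List[RelationshipType] = field(default_factory=list)

    def well_formedness_violations(self) -> List[str]:
        """Report violations of a few sample (invented) well-formedness rules."""
        errors = []
        for e in self.entities:
            if e.key is None or e.key not in e.attributes:
                errors.append(f"entity {e.name}: key must be one of its attributes")
        for r in self.relationships:
            if len(r.participants) < 2:
                errors.append(f"relationship {r.name}: needs at least two participants")
        return errors


dept = EntityType("Department", ["dept_id", "name"], key="dept_id")
emp = EntityType("Employee", ["emp_no", "name"])  # no key declared -> invalid model
schema = Schema([dept, emp], [RelationshipType("works_in", [emp, dept])])
print(schema.well_formedness_violations())
```

Running the sketch flags the Employee entity as violating the key rule, which is the kind of valid/invalid distinction a real metamodel such as the one described above is built to enforce.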