PRINCIPLES of SOFTWARE ENGINEERING SECTION-C (NOTES) SEM-6TH ARCHITECTURAL DESIGN: Software Architecture Serves As the Blueprint
Total Pages: 16
File Type: PDF, Size: 1020 KB

Recommended publications
Hierarchical, Network and Relational Data (Remote Sensing & GIS Applications in Environmental Sciences, Module 22)
Paper No. 6: Remote Sensing & GIS Applications in Environmental Sciences. Module 22: Hierarchical, network and relational data. Development Team: Principal Investigator and Co-Principal Investigators, Prof. R.K. Kohli, Prof. V.K. Garg and Prof. Ashok Dhawan, Central University of Punjab, Bathinda; Paper Coordinator, Dr. Puneeta Pandey, Centre for Environmental Sciences and Technology, Central University of Punjab, Bathinda; Content Writer, Dr. Dinesh Kumar, Department of Environmental Sciences, Central University of Jammu; Content Reviewer, Dr. Puneeta Pandey, Central University of Punjab, Bathinda; Anchor Institute, Central University of Punjab. Description of Module: Subject Name, Environmental Sciences; Paper Name, Remote Sensing & GIS Applications in Environmental Sciences; Module Name/Title, Hierarchical, network and relational data; Module Id, EVS/RSGIS-EVS/22; Pre-requisites, introductory knowledge of computers, basic mathematics and GIS; Objectives, to understand the concept of DBMS and database models; Keywords, Fuzzy Logic, Decision Tree, Vegetation Indices, Hyper-spectral, Multispectral. 1. Learning Objective: The objective of this module is to understand the concept of the Database Management System (DBMS) and database models. The module explains in detail the relevant data models used in a GIS platform, along with their pros and cons. 2. Introduction: GIS is a very powerful tool for a range of applications across the disciplines. The capacity of a GIS to store, retrieve, analyze and model information comes from its database management system (DBMS).
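The module's objective names the classical hierarchical, network and relational database models. The sketch below is a toy contrast of the first and last of these, not taken from the module itself; all table and field names are invented for illustration.

```python
# Toy contrast (not from the module): the same land-parcel records held
# hierarchically (nested under their district, one built-in access path)
# and relationally (flat tables linked by keys). Names are invented.
hierarchical = {
    "District-A": {
        "parcels": [
            {"parcel_id": "P1", "land_use": "forest"},
            {"parcel_id": "P2", "land_use": "urban"},
        ]
    }
}

# Relational form: separate "tables" joined through district_id.
districts = [{"district_id": "District-A", "name": "District A"}]
parcels = [
    {"parcel_id": "P1", "district_id": "District-A", "land_use": "forest"},
    {"parcel_id": "P2", "district_id": "District-A", "land_use": "urban"},
]

# The hierarchical query follows the fixed parent-to-child path...
forest_h = [p["parcel_id"] for p in hierarchical["District-A"]["parcels"]
            if p["land_use"] == "forest"]
# ...while the relational query filters and joins on key values,
# independent of any predefined access path.
forest_r = [p["parcel_id"] for p in parcels
            if p["district_id"] == "District-A" and p["land_use"] == "forest"]
print(forest_h, forest_r)
```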
Data Warehouse: an Integrated Decision Support Database Whose Content Is Derived from the Various Operational Databases
DATABASE MANAGEMENT. Topic Objective: at the end of this topic the student will be able to understand the contrasting basic concepts, the database server and database specified, and the USER clause. Definition/Overview:
Data: stored representations of objects and events that have meaning and importance in the user's environment.
Information: data that have been processed in such a way that they can increase the knowledge of the person who uses it.
Metadata: data that describe the properties or characteristics of end-user data and the context of that data.
Database application: an application program (or set of related programs) that is used to perform a series of database activities (create, read, update, and delete) on behalf of database users.
Data warehouse: an integrated decision-support database whose content is derived from the various operational databases.
Constraint: a rule that cannot be violated by database users.
Database: an organized collection of logically related data.
Entity: a person, place, object, event, or concept in the user environment about which the organization wishes to maintain data.
Database management system: a software system that is used to create, maintain, and provide controlled access to user databases.
Data dependence vs. data independence: with data dependence, data descriptions are included with the application programs that use the data; with data independence, the data descriptions are separated from the application programs.
Data warehouse vs. data mining: a data warehouse is an integrated decision-support database, while data mining (described in the topic introduction) is the process of extracting useful information from databases.
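Several of these glossary terms (entity, constraint, metadata, database application, DBMS) can be seen together in a few lines of code. The sketch below is only an illustration of the definitions, not part of the course notes; it uses Python's built-in sqlite3 module, and the table and column names are invented.

```python
# Minimal sketch: an "entity" becomes a table, a "constraint" becomes a rule the
# DBMS enforces, and "metadata" is what the catalog records about the data.
import sqlite3

conn = sqlite3.connect(":memory:")          # the DBMS manages this database
conn.execute("""
    CREATE TABLE employee (                 -- entity: Employee
        emp_id   INTEGER PRIMARY KEY,       -- identifying attribute
        name     TEXT NOT NULL,             -- attribute
        salary   REAL CHECK (salary >= 0)   -- constraint: cannot be violated
    )
""")

# A database application performs CRUD operations on behalf of users.
conn.execute("INSERT INTO employee (name, salary) VALUES (?, ?)", ("Alice", 52000.0))  # create
rows = conn.execute("SELECT emp_id, name, salary FROM employee").fetchall()            # read
conn.execute("UPDATE employee SET salary = 55000.0 WHERE name = ?", ("Alice",))        # update
conn.execute("DELETE FROM employee WHERE name = ?", ("Alice",))                        # delete

# Metadata: data about the data, held in the system catalog.
metadata = conn.execute("PRAGMA table_info(employee)").fetchall()
print(rows, metadata)
```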
PowerDesigner 16.6 Data Modeling
SAP® PowerDesigner®, Document Version 16.6 (2016-02-22): Data Modeling. Contents:
1 Building Data Models
1.1 Getting Started with Data Modeling: Conceptual Data Models; Logical Data Models; Physical Data Models; Creating a Data Model; Customizing your Modeling Environment
1.2 Conceptual and Logical Diagrams: Supported CDM/LDM Notations; Conceptual Diagrams; Logical Diagrams; Data Items (CDM); Entities (CDM/LDM); Attributes (CDM/LDM); Identifiers (CDM/LDM); Relationships (CDM/LDM); Associations and Association Links (CDM); Inheritances (CDM/LDM)
1.3 Physical Diagrams
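The contents above separate conceptual constructs (entities, attributes, identifiers, relationships) from the physical diagrams derived from them. The following hand-written sketch illustrates that conceptual-to-physical step; it is not PowerDesigner's generator or API, and the entity, attribute and type names are invented.

```python
# Derive a physical-model table (SQL DDL) from a small conceptual entity description.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Attribute:
    name: str
    datatype: str = "VARCHAR(50)"
    identifier: bool = False          # part of the entity's primary identifier

@dataclass
class Entity:
    name: str
    attributes: List[Attribute] = field(default_factory=list)

def to_physical(entity: Entity) -> str:
    """Generate CREATE TABLE DDL for a conceptual entity."""
    cols = [f"    {a.name} {a.datatype}" for a in entity.attributes]
    pk = [a.name for a in entity.attributes if a.identifier]
    if pk:
        cols.append(f"    PRIMARY KEY ({', '.join(pk)})")
    return f"CREATE TABLE {entity.name} (\n" + ",\n".join(cols) + "\n);"

customer = Entity("customer", [
    Attribute("customer_id", "INTEGER", identifier=True),
    Attribute("name"),
    Attribute("email", "VARCHAR(100)"),
])
print(to_physical(customer))
```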
Integration Definition for Function Modeling (IDEF0)
NIST, U.S. Department of Commerce, Technology Administration, National Institute of Standards and Technology. FIPS PUB 183: Federal Information Processing Standards Publication, Integration Definition for Function Modeling (IDEF0). Category: Software Standard. Subcategory: Modeling Techniques. Computer Systems Laboratory, National Institute of Standards and Technology, Gaithersburg, MD 20899. Issued December 21, 1993. U.S. Department of Commerce: Ronald H. Brown, Secretary. Technology Administration: Mary L. Good, Under Secretary for Technology. National Institute of Standards and Technology: Arati Prabhakar, Director. Foreword: The Federal Information Processing Standards Publication Series of the National Institute of Standards and Technology (NIST) is the official publication relating to standards and guidelines adopted and promulgated under the provisions of Section 111(d) of the Federal Property and Administrative Services Act of 1949, as amended by the Computer Security Act of 1987, Public Law 100-235. These mandates have given the Secretary of Commerce and NIST important responsibilities for improving the utilization and management of computer and related telecommunications systems in the Federal Government. The NIST, through its Computer Systems Laboratory, provides leadership, technical guidance,
A Pattern-Based Approach to Ontology-Supported Semantic Data Integration
A pattern-based approach to ontology-supported semantic data integration. Andreas A. Jordan, Wolfgang Mayer, Georg Grossmann, Markus Stumptner. Advanced Computing Research Centre, University of South Australia, Mawson Lakes Boulevard, Mawson Lakes, South Australia 5095. Email: [email protected]; Email: (wolfgang.mayer|georg.grossmann)@unisa.edu.au, [email protected]. Abstract: Interoperability between information systems is rapidly becoming an increasingly difficult issue, in particular where data is to be exchanged with external entities. Although standard naming conventions and forms of representation have been established, significant discrepancies remain between different implementations of the same standards and between different standards within the same domain. Furthermore, current approaches for bridging such heterogeneities are incomplete or do not scale well to real-world problems. The RSM, ISO 15926 and Gellish standards are all used to represent information about engineering assets and associated maintenance information. However, there exists considerable variation in the scope and level of detail present in each standard, and where the information content covered by different standards overlaps, their vocabulary and representation vary considerably. Furthermore, standards like ISO 15926 are large yet offer little guidance on how to represent particular asset information in their generic data structures. Heterogeneities arising from continued evolution of standards have further exacerbated the problem. As a consequence, tool vendors formally support the standard, yet their implementations do not interoperate due to different use of the same standard or varying assumptions about the required and optional information held in each system. We present a pragmatic approach to interoperability, where ontologies are leveraged to annotate prototypical information fragments in order to facilitate automated transformation between fragments in different representations.
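The abstract describes annotating prototypical information fragments with ontology terms so that fragments in different representations can be transformed automatically. The toy sketch below is one way to picture that idea; it is not the authors' implementation, and the field names, ontology terms and pump record are all invented.

```python
# Two systems describe the same asset in different vocabularies; annotating each
# field with a shared ontology term lets a generic routine translate records.
ONTOLOGY_MAP_A = {"tag": "asset:identifier", "maxPressure_kPa": "asset:design_pressure"}
ONTOLOGY_MAP_B = {"equipment_id": "asset:identifier", "design_pressure": "asset:design_pressure"}

def to_ontology(record: dict, mapping: dict) -> dict:
    """Lift a system-specific record onto shared ontology terms."""
    return {mapping[k]: v for k, v in record.items() if k in mapping}

def from_ontology(facts: dict, mapping: dict) -> dict:
    """Project shared ontology facts back into a system-specific schema."""
    inverse = {term: fld for fld, term in mapping.items()}
    return {inverse[t]: v for t, v in facts.items() if t in inverse}

record_a = {"tag": "P-101", "maxPressure_kPa": 1600}
record_b = from_ontology(to_ontology(record_a, ONTOLOGY_MAP_A), ONTOLOGY_MAP_B)
print(record_b)   # {'equipment_id': 'P-101', 'design_pressure': 1600}
```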
Metamodeling and Method Engineering with Conceptbase”
This is a pre-print of the book chapter M. Jeusfeld: "Metamodeling and method engineering with ConceptBase". In Jeusfeld, M.A., Jarke, M., Mylopoulos, J. (eds): Metamodeling for Method Engineering, pp. 89-168, The MIT Press, 2009; the original book is available from MIT Press, http://mitpress.mit.edu/node/192290. This pre-print may only be used for scholarly, non-commercial purposes. Most of the sources for the examples in this chapter are available via http://merkur.informatik.rwth-aachen.de/pub/bscw.cgi/3782591 for download. They require ConceptBase 7.0 or later, available from http://conceptbase.cc. Metamodeling and Method Engineering with ConceptBase. Manfred Jeusfeld. Abstract: This chapter provides a practical guide on how to use the metadata repository ConceptBase to design information modeling methods by using metamodeling. After motivating the abstraction principles behind metamodeling, the language Telos as realized in ConceptBase is presented. First, a standard factual representation of statements at any IRDS abstraction level is defined. Then, the foundation of Telos as a logical theory is elaborated, yielding simple fixpoint semantics. The principles for object naming, instantiation, attribution, and specialization are reflected by roughly 30 logical axioms. After the language axiomatization, user-defined rules, constraints and queries are introduced. The first part is concluded by a description of active rules that allow the specification of reactions of ConceptBase to external events. The second part applies the language features of the first part to a full-fledged information modeling method: the Yourdon method for Modern Structured Analysis. The notations of the Yourdon method are designed along the IRDS framework.
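The abstract mentions a uniform factual representation of statements at any IRDS abstraction level, together with principles for instantiation, attribution and specialization. The following sketch is a loose approximation of that idea in Python, not ConceptBase or Telos syntax; all object names are invented.

```python
# Every statement (class, attribute, instantiation, specialization) is stored
# uniformly as a proposition (id, source, label, destination).
from typing import NamedTuple, List

class P(NamedTuple):
    oid: str
    source: str
    label: str
    destination: str

facts: List[P] = [
    P("o1", "Employee", "in", "EntityType"),     # class level: Employee instantiates EntityType
    P("o2", "Employee", "salary", "Integer"),    # attribution at the class level
    P("o3", "Manager", "isA", "Employee"),       # specialization
    P("o4", "mary", "in", "Manager"),            # token level: mary instantiates Manager
    P("o5", "mary", "salary", "90000"),          # attribution at the token level
]

def instances_of(cls: str) -> List[str]:
    """All objects with an explicit or inherited instantiation link to cls."""
    subclasses = {cls} | {f.source for f in facts if f.label == "isA" and f.destination == cls}
    return [f.source for f in facts if f.label == "in" and f.destination in subclasses]

print(instances_of("Employee"))   # ['mary'], via Manager isA Employee
```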
Using Telelogic DOORS and Microsoft Visio to Model and Visualize Complex Business Processes
Using Telelogic DOORS and Microsoft Visio to Model and Visualize Complex Business Processes: "The Business Driven Application Lifecycle". Bob Sherman, Procter & Gamble Pharmaceuticals, [email protected]; Michael Sutherland, Galactic Solutions Group, LLC, [email protected]. Prepared for the Telelogic 2005 User Group Conference, Americas & Asia/Pacific, http://www.telelogic.com/news/usergroup/us2005/index.cfm, 24 October 2005. Abstract: The fact that most Information Technology (IT) projects fail as a result of requirements management problems is common knowledge. What is not commonly recognized is that the widely hailed "use case" and Object Oriented Analysis and Design (OOAD) phenomena have resulted in little (if any) abatement of IT project failures. In fact, ten years after the advent of these methods, every major IT industry research group remains aligned on the fact that these projects are still failing at an alarming rate (less than a 30% success rate). Ironically, the popularity of use case and OOAD (e.g. UML) methods may be doing more harm than good by diverting our attention away from addressing the real root cause of IT project failures (when you have a new hammer, everything looks like a nail). This paper asserts that the real root cause of IT project failures centers around the failure to map requirements to an accurate, precise, comprehensive, optimized business model. This argument will be supported by a brief recap of the history of use case and OOAD methods to identify differences between the problems these methods were intended to address and the challenges of today's IT projects.
Geographic Information Services
Note: In order to continue to make this intellectual content available to the public, the National Transportation Library (NTL) copied the contents of this document from its original HTML file. The link to the original HTML file, which was last accessed on 2020-07-29, is https://ntlrepository.blob.core.windows.net/lib/4000/4200/4256/report.html. NTL has preserved this content as it was presented in the HTML file, and has checked this document to ensure as complete a representation as possible. Questions about this file can be directed through the NTL Ask a Librarian service at: https://transportation.libanswers.com/form.php?queue_id=1810
Geographic Information Services. On the Results of a Workshop on Generic Data Model for Linear Referencing Systems. Authored by workshop participants with Alan Vonderohe (Principal Investigator), supported by the National Cooperative Highway Research Program project. PREFACE: The first draft of this report was prepared from notes and other materials developed during the workshop, audio tape recordings of the workshop sessions, and follow-up discussions with workshop participants. All participants were given the opportunity to review the first draft and provide responses. These responses were assimilated to identify both consensus revisions to the first draft and significant points of contention. The responders were informed of these results and asked to provide their opinions on each of the points of contention. The responders were also asked to provide measures of the relative importance of their positions on each of the points of contention (i.e., "critical", "strong preference", "weak preference"). The revised model presented in this second draft is true to the model as developed during the workshop and to the revisions on which there was consensus.
The Entity-Relationship Model — Toward a Unified View of Data
Working Paper, Alfred P. Sloan School of Management, Massachusetts Institute of Technology, 50 Memorial Drive, Cambridge, Massachusetts 02139. The Entity-Relationship Model — Toward a Unified View of Data*. By Peter Pin-Shan Chen. WP 839-76, March 1976. * A revised version of this paper will appear in the ACM Transactions on Database Systems. ABSTRACT: A data model, called the entity-relationship model, is proposed. This model incorporates some of the important semantic information about the real world. A special diagrammatic technique is introduced as a tool for data base design. An example of data base design and description using the model and the diagrammatic technique is given. Some implications on data integrity, information retrieval, and data manipulation are discussed. The entity-relationship model can be used as a basis for unification of different views of data: the network model, the relational model, and the entity set model. Semantic ambiguities in these models are analyzed. Possible ways to derive their views of data from the entity-relationship model are presented. KEY WORDS AND PHRASES: data base design, logical view of data, semantics of data, data models, entity-relationship model, relational model, Data Base Task Group, network model, entity set model, data definition and manipulation, data integrity and consistency. CR CATEGORIES: 3.50, 3.70, 4.33, 4.34.
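To make the abstract's vocabulary concrete, here is a small sketch (not from Chen's paper) of two entity sets, a relationship set with an attribute of its own, and the way each maps onto a relation. The names loosely echo the employee/project examples commonly used with the ER model, but the details are invented.

```python
# Entity sets, a relationship set, and their mapping onto relations (tables).
from dataclasses import dataclass

@dataclass(frozen=True)
class Employee:               # entity set EMPLOYEE
    emp_no: int               # key attribute
    name: str

@dataclass(frozen=True)
class Project:                # entity set PROJECT
    proj_no: int
    title: str

@dataclass(frozen=True)
class WorksOn:                # relationship set between EMPLOYEE and PROJECT
    emp_no: int               # key of the participating employee
    proj_no: int              # key of the participating project
    percentage: float         # attribute of the relationship itself

# Under the relational view, each entity set and each relationship set becomes a
# relation; the relationship relation carries the keys of the entities it connects.
employees = {Employee(1, "Smith"), Employee(2, "Jones")}
projects = {Project(10, "Unified view of data")}
works_on = {WorksOn(1, 10, 100.0), WorksOn(2, 10, 50.0)}

# Example query: who works on project 10?
print(sorted(e.name for e in employees
             if any(w.emp_no == e.emp_no and w.proj_no == 10 for w in works_on)))
```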
Metamodeling the Enhanced Entity-Relationship Model
Metamodeling the Enhanced Entity-Relationship Model. Robson N. Fidalgo¹, Edson Alves¹, Sergio España², Jaelson Castro¹, Oscar Pastor². ¹ Center for Informatics, Federal University of Pernambuco, Recife (PE), Brazil, {rdnf, eas4, jbc}@cin.ufpe.br. ² Centro de Investigación ProS, Universitat Politècnica de València, València, España, {sergio.espana,opastor}@pros.upv.es. Abstract: A metamodel provides an abstract syntax to distinguish between valid and invalid models. That is, a metamodel is as useful for a modeling language as a grammar is for a programming language. In this context, although the Enhanced Entity-Relationship (EER) Model is the "de facto" standard modeling language for database conceptual design, to the best of our knowledge there are only two proposals of EER metamodels, and these do not provide full support for Chen's notation. Furthermore, neither a discussion of the engineering used to specify these metamodels is presented nor a comparative analysis among them is made. With the aim of overcoming these drawbacks, we show a detailed and practical view of how to formalize the EER Model by means of a metamodel that (i) covers all elements of Chen's notation, (ii) defines the well-formedness rules needed for creating syntactically correct EER schemas, and (iii) can be used as a starting point to create Computer Aided Software Engineering (CASE) tools for EER modeling, interchange metadata among these tools, perform automatic SQL/DDL code generation, and/or extend (or reuse part of) the EER Model. In order to show the feasibility, expressiveness, and usefulness of our metamodel (named EERMM), we have developed a CASE tool (named EERCASE), which has been tested with a practical example that covers all EER constructors, confirming that our metamodel is feasible, useful, more expressive than related ones and correctly defined.
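To illustrate what "a metamodel plus well-formedness rules" buys in practice, the sketch below defines a drastically reduced abstract syntax for EER schemas and a checker for a few illustrative rules. It is an invented toy, far smaller than EERMM, and the rules shown are only examples of the kind of constraint such a metamodel would define.

```python
# A tiny abstract syntax for EER schemas plus a well-formedness checker.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Entity:
    name: str
    key_attributes: List[str] = field(default_factory=list)
    weak: bool = False

@dataclass
class Relationship:
    name: str
    participants: List[Entity] = field(default_factory=list)
    identifying: bool = False          # identifies a weak entity

def check_schema(entities: List[Entity], relationships: List[Relationship]) -> List[str]:
    """Return violations of a few illustrative well-formedness rules."""
    errors = []
    for e in entities:
        if not e.weak and not e.key_attributes:
            errors.append(f"regular entity {e.name} needs a key attribute")
        if e.weak and not any(r.identifying and e in r.participants for r in relationships):
            errors.append(f"weak entity {e.name} needs an identifying relationship")
    for r in relationships:
        if len(r.participants) < 2:
            errors.append(f"relationship {r.name} must connect at least two participants")
    return errors

dept = Entity("Department", ["dept_no"])
dependent = Entity("Dependent", weak=True)          # invalid: no identifying relationship
print(check_schema([dept, dependent], [Relationship("works_for", [dept])]))
```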
A Framework for GIS-T Data Sharing
Portland State University, PDXScholar. Center for Urban Studies Publications and Reports, Center for Urban Studies, 6-2000. A Framework for GIS-T Data Sharing. Kenneth Dueker, Portland State University; J. Allison Butler. Follow this and additional works at: https://pdxscholar.library.pdx.edu/cus_pubs. Part of the Transportation Commons, and the Urban Studies and Planning Commons. Let us know how access to this document benefits you. Citation Details: Dueker, Kenneth and Butler, J. Allison, "A Framework for GIS-T Data Sharing" (2000). Center for Urban Studies Publications and Reports. 13. https://pdxscholar.library.pdx.edu/cus_pubs/13. This Working Paper is brought to you for free and open access. It has been accepted for inclusion in Center for Urban Studies Publications and Reports by an authorized administrator of PDXScholar. Please contact us if we can make this document more accessible: [email protected]. A Framework for GIS-T Data Sharing. Kenneth J. Dueker, Portland State University; J. Allison Butler, Hamilton County, Tennessee. December 1999 (Revised June 2000). Submitted to Transportation Research Part C: Emerging Technologies. Center for Urban Studies, Portland State University, Portland OR 97207-0751, www.upa.pdx.edu/cus/. Abstract: This paper develops a framework and principles for sharing of transportation data. The framework is intended to clarify roles among participants: data producers, data integrators, and data users. The principles are intended to provide guidance for the participants. Both the framework and the principles are based on an enterprise GIS-T data model that defines relations among transportation data elements. The data model guards against ambiguities and provides a basis for the development of the framework and principles for sharing of transportation data.
A Case Study Examination of Data Modelling in Practice (Paul Groves)
A Case Study Examination of Data Modelling in Practice. Paul Groves. A report submitted in partial fulfilment of the requirements of the degree of Master of Commerce (Honours) to the University of New South Wales, 1988. CERTIFICATION: "I hereby declare that this submission is my own work and that, to the best of my knowledge and belief, it contains no material previously published or written by another person nor material which to a substantial extent has been accepted for the award of any other degree or diploma of a University or any other institute of higher learning, except where due acknowledgement is made in the text." ABSTRACT: Data modelling for analysis and database design is increasingly being viewed as a critical phase in the systems development process. This report is a comparative analysis of data modelling theory and practice. It investigates the nature of data and examines several data modelling methodologies. Current international standards for the conceptual data model are reviewed, and based on this a reference framework is defined. This framework is used to compare four contemporary data modelling theories. Field testing of three of the methods is conducted: two case studies from a commercial environment and one from an academic setting. The case studies are conducted on a descriptive research basis. Results from the case studies confirm that data modelling represents a technique of growing importance in the systems development process. Increasing resources applied to the practice of relational databases should ensure ongoing theoretical interest and development. Although in the formative stages of implementation and use, binary data modelling was seen to have achieved notable success in enhancing communication between project participants and in increasing user participation.