Data Modeling Zone Europe September 28-29 2015 in Hamburg, Germany


Data Modeling Zone 2015
Session levels: fundamental (for all audiences), intermediate, advanced

Monday, September 28

7:00-9:00    Breakfast and Registration

9:00-12:00   Workshops:
             - Introduction Workshop to the Data Vault Approach – Kasper de Graaf [Netherlands] – Hamburg, Pg 1
             - Data Governance for Data Modelers Workshop – Deborah Henderson, Broadstreet Data [Canada] – Wismar, Pg 1
             - Data Modeling by Example - Introduction and Workshop – Marco Wobben, BCP Software [Netherlands] – Lübeck, Pg 1
             - Advanced Data Modeling Challenges Workshop – Steve Hoberman, Steve Hoberman & Associates, LLC [United States] – Rostock, Pg 2

12:00-1:15   Lunch
1:15-1:30    Welcome and Announcements
1:30-2:30    KEYNOTE: Data Modeler 2020 – Future of Data Modeling Panel – Lars Rönnbäck, Dan Linstedt, Hans Hultgren – Hamburg, Pg 3

2:30-3:30    Sessions:
             - Introduction to Data Vault 2.0 and BRAIN – Evolution, Architecture and Design Principles – Dan Linstedt, Empowered Holdings, LLC [United States] – Hamburg, Pg 3
             - Modeling the Data as a Service – concepts and challenges for a new BI architecture – Pushpak Sarkar, New York Life Insurance [United States] – Wismar, Pg 4
             - Disciplined Agile Delivery – Speakers to be announced shortly – Lübeck
             - Anchor Modeling – Lars Rönnbäck, Up To Change [Sweden] – Rostock, Pg 3

3:30-4:00    Afternoon Snacks

4:00-5:00    Sessions:
             - Ensemble Modeling – Hans Hultgren, Genesee Academy [United States] – Hamburg, Pg 5
             - Evaluating Data Modeling Tools? Helping You to Decide – George McGeachie, Metadata Matters [United Kingdom] – Wismar, Pg 5
             - Big Data at BT (A Game of Elephants) – Phill Radley, BT [United Kingdom] – Lübeck, Pg 6
             - Keeping Multi-Temporal History in Data Warehouses – Challenges, Chances and Best Practices – Dr. Michael Hahne [Germany] – Rostock, Pg 7

Tuesday, September 29

7:00-9:00    Breakfast and Registration
8:00-8:30    Special Interest Groups (SIGs) – sessions to be announced in May, June, July and August

9:00-12:00   Workshops:
             - Facilitation and Training Skills for Data Professionals – Artie Mahal, ASM Group Inc. [United States] – Rostock, Pg 8
             - From Source Data to Visual BI, a Client Case Study – Dirk Lerner, ITGAIN, and Andreas Wiener, reportingimpulse [Germany] – Hamburg, Pg 9
             - The Data Modeler's Road to the Certified Data Management Professional (CDMP) – Patricia Cupoli, CCP, CDMP, CBIP, DAMA International [United States] – Lübeck, Pg 10
             - Advanced Anchor Modeling Workshop – Lars Rönnbäck, Up To Change [Sweden] – Wismar, Pg 10

12:00-1:15   Lunch
1:15-1:30    Welcome and Announcements
1:30-2:30    KEYNOTE: Crossing the Unstructured Barrier – Bill Inmon, Forest Rim Technologies – Hamburg, Pg 11

2:30-3:30    Sessions:
             - Data Models, Transformations, and End-user Participation – Peter Alons, PAlCon, and Rob Arntz, Atos [Netherlands] – Lübeck, Pg 11
             - DMBOK Overview – Deborah Henderson, Broadstreet Data [Canada] – Wismar, Pg 13
             - Big Data Modeling – Hans Hultgren, Genesee Academy [United States] – Hamburg, Pg 13
             - Championing the Data Model Scorecard within your Organization – Steve Hoberman, Steve Hoberman & Associates, LLC [United States] – Rostock, Pg 14

3:30-4:00    Afternoon Snacks

4:00-5:00    Sessions:
             - Pharma Data Modeling Case Study – Speakers to be announced shortly – Wismar
             - Making Your Unstructured Data Come Alive – Bill Inmon, Forest Rim Technologies [United States] – Lübeck, Pg 14
             - Metadata driven Anchor Model generation: a unifying approach – Tim Schiettekatte, ChainPoint [Germany] – Hamburg, Pg 15
             - Advanced Data Vault Design – Dan Linstedt, Empowered Holdings, LLC [United States] – Rostock, Pg 15


Introduction Workshop to the Data Vault Approach
Kasper de Graaf

This session will focus on an introduction to Data Vault Modeling, both from a business modeling and from a technical/modeling perspective. The theoretical background is followed by a case study where the participants can build their own Data Vault model.

Kasper de Graaf (Occurro) is an independent data warehouse architect, trainer and modeler. He specializes in data warehouse architecture, data modeling (data vault and dimensional modeling) and BI/DWH development, preferably using open source technology. Before starting his own company he worked for several consulting and training companies and did projects for numerous organizations in Europe during the last 20 years.

Data Governance for Data Modelers Workshop
Deborah Henderson, Broadstreet Data

Data Governance may seem like a faraway topic to data modelers: too strategic to have much direct impact on a modeler's work. This seminar will connect the strategy directly to the modeler's work and show the benefits of data governance in a systemic way, as modelers always knew systems development really works. In this workshop we will look at:

- Data Governance as a framework
- What modelers are already doing, and the connection to data governance
- Modeling in context: architecture, design, operations
- What's important and when: principles based modeling

We will complete 'hands on' exercises:

- Estimating templates – from model discovery to model creation
- Reporting on modeling activity and governance scorecards
- Organizing for Quality, and the modelers' place in this

Deborah Henderson, B.Sc., MLS, PMP, CDMP is the Data Governance Practice Manager for Broadstreet Data in Toronto and teaches data governance fundamentals classes publicly and privately. She has been Program Director for the DAMA-DMBOK (Data Management Body of Knowledge), a global effort, since 2005. With over 25 years in data management, she consults in data governance in the energy, capital markets, health and automotive sectors.

Data Modeling by Example - Introduction and Workshop
Marco Wobben, BCP Software

In many DMZ presentations, data modeling is described as both a craft and an art. For most outsiders, data modeling is some kind of magic: the data modeler interviews business experts, studies piles of requirements, talks some more, and then, hocus pocus, presents a diagram with boxes, crow's feet, arrows, etc.

Fact based information modeling is the very opposite of such magic. It does not require people to understand the modeler's magical language of boxes and arrows. Instead, fact based information modeling uses natural language to describe sample facts that are intelligible for business people. Therefore, it is also known as "Data Modeling by Example".

Part 1 – Introduction

The presentation highlights:
- the origins and key elements of fact based modeling;
- its place in the requirements engineering process;
- usage in large scale information management;
- some forward engineering capabilities (ER, UML).

Part 2 – Workshop

In the workshop you will install a free modeling tool on your own Windows computer and practice with:
- verbalizing elementary facts;
- modeling with fact expressions;
- visualizing and validating your model;
- generating output for business experts and software engineers.

Marco Wobben is a partner of BCP Software and has been developing software for more than 30 years. He has developed applications for financial experts, software to remotely operate bridges, and a wide range of web applications. For the past 10 years, he has been the main developer of CaseTalk, the CASE tool for fact based information modeling, which is widely used in universities in the Netherlands and beyond.
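To make the workshop's fact-based approach a little more concrete, here is a small, hedged illustration of "verbalizing elementary facts". The fact types, sample population, and helper function are invented for this example; they are not CaseTalk output and are not taken from the conference programme.

    # A minimal, hypothetical illustration of fact-based modeling:
    # elementary facts are expressed as natural-language sentences over
    # sample data, so business experts can validate them without reading
    # boxes-and-arrows diagrams. The fact types and data are made up.

    # Each fact type pairs a sentence template with the roles it binds.
    FACT_TYPES = {
        "employment": "Employee '{employee}' works for department '{department}'.",
        "assignment": "Employee '{employee}' is assigned to project '{project}'.",
    }

    # Sample population: concrete examples gathered from interviews and documents.
    SAMPLE_FACTS = [
        ("employment", {"employee": "Ava Jensen", "department": "Claims"}),
        ("employment", {"employee": "Noah Patel", "department": "Underwriting"}),
        ("assignment", {"employee": "Ava Jensen", "project": "Policy Portal"}),
    ]

    def verbalize(fact_type: str, roles: dict) -> str:
        """Turn one elementary fact into a sentence a business expert can check."""
        return FACT_TYPES[fact_type].format(**roles)

    if __name__ == "__main__":
        for fact_type, roles in SAMPLE_FACTS:
            print(verbalize(fact_type, roles))

Once business experts have confirmed such verbalized sample facts, the validated fact types can be carried forward into ER or UML structures, which is the direction the "forward engineering capabilities" item in Part 1 points to.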
Advanced Data Modeling Challenges Workshop
Steve Hoberman, Steve Hoberman & Associates, LLC

After you are comfortable with data modeling terminology and have built a number of data models, often the way to continuously sharpen your skills is to take on more challenging assignments. Join us for a half day of tackling real world data modeling scenarios. We will complete at least ten challenges covering these four areas:

- NoSQL data modeling
- Agile and data modeling
- Abstraction
- Advanced relational and dimensional modeling

Join us as, in groups, we solve and discuss a set of model scenarios.

Steve Hoberman taught his first data modeling class in 1992 and has trained more than 10,000 people since then, spanning every continent except Africa and Antarctica. Steve is known for his entertaining and interactive teaching style (watch out for flying candy!), and organizations around the globe have brought Steve in to teach his Data Modeling Master Class, which is recognized as the most comprehensive data modeling course in the industry. Steve is the author of seven books on data modeling, including the bestseller Data Modeling Made Simple. His latest book, Data Modeling for MongoDB, presents a streamlined approach to data modeling for NoSQL solutions. One of Steve's frequent data modeling consulting assignments is to review data models using his Data Model Scorecard® technique. He is the founder of the Design Challenges group, Conference Chair of the Data Modeling Zone conference, recipient of the 2012 Data Administration Management Association (DAMA) International Professional Achievement Award, and highest rated presenter at Enterprise Data World 2015.

Anchor Modeling
Lars Rönnbäck, Up To Change

... implementations, along with awarded Open Source tools and scientific papers, the technique is now gaining worldwide momentum. Anchor Modeling can be implemented in traditional relational databases and it is based on current research in entity relationship modeling, database normalization and temporal databases. The presentation will give you an introduction to Anchor Modeling, after which you will be able to create your own models.
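As a rough illustration of the statement above that Anchor Modeling can be implemented in traditional relational databases, the sketch below creates one anchor and two independently historized attribute tables, then reads back the latest values. Table and column names are my own invention and only loosely follow common Anchor Modeling naming conventions; this is not material from the presentation.

    # A minimal sketch (not from the presentation) of an anchor-style model
    # in a plain relational database: one anchor per entity, one narrow table
    # per attribute, each attribute historized with a FromDate column.
    import sqlite3

    ddl = """
    CREATE TABLE PE_Person (            -- anchor: identities only
        PE_ID INTEGER PRIMARY KEY
    );
    CREATE TABLE PE_NAM_Person_Name (   -- historized attribute
        PE_ID    INTEGER NOT NULL REFERENCES PE_Person (PE_ID),
        Name     TEXT    NOT NULL,
        FromDate TEXT    NOT NULL,
        PRIMARY KEY (PE_ID, FromDate)
    );
    CREATE TABLE PE_CTY_Person_City (   -- another attribute, evolves independently
        PE_ID    INTEGER NOT NULL REFERENCES PE_Person (PE_ID),
        City     TEXT    NOT NULL,
        FromDate TEXT    NOT NULL,
        PRIMARY KEY (PE_ID, FromDate)
    );
    """

    with sqlite3.connect(":memory:") as con:
        con.executescript(ddl)
        con.execute("INSERT INTO PE_Person (PE_ID) VALUES (1)")
        con.executemany(
            "INSERT INTO PE_NAM_Person_Name VALUES (?, ?, ?)",
            [(1, "A. Smith", "2014-01-01"), (1, "A. Smith-Jones", "2015-06-01")],
        )
        con.execute("INSERT INTO PE_CTY_Person_City VALUES (1, 'Hamburg', '2014-01-01')")

        # "Latest view": pick the most recent value of each attribute per anchor.
        latest = con.execute("""
            SELECT p.PE_ID,
                   (SELECT n.Name FROM PE_NAM_Person_Name n
                     WHERE n.PE_ID = p.PE_ID ORDER BY n.FromDate DESC LIMIT 1) AS Name,
                   (SELECT c.City FROM PE_CTY_Person_City c
                     WHERE c.PE_ID = p.PE_ID ORDER BY c.FromDate DESC LIMIT 1) AS City
              FROM PE_Person p
        """).fetchall()
        print(latest)   # [(1, 'A. Smith-Jones', 'Hamburg')]

Because each attribute lives in its own narrow table, adding history for one attribute or introducing a new attribute never requires altering an existing table.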
Data Modeler 2020 – Future of Data Modeling Panel
Lars Rönnbäck, Dan Linstedt, Hans Hultgren
Recommended publications
  • Sixth Normal Form
    www.ijraset.com, Volume 3, Issue I, January 2015, ISSN: 2321-9653. International Journal for Research in Applied Science & Engineering Technology (IJRASET). Sixth Normal Form. Neha (M.Tech) and Sanjeev Kumar (Assistant Professor), Department of CSE, Shri Balwant College of Engineering & Technology, DCRUST University. Abstract – Sixth Normal Form (6NF) is a term used in relational database theory by Christopher Date to describe databases which decompose relational variables to irreducible elements. While this form may be unimportant for non-temporal data, it is certainly important when maintaining data containing temporal variables of a point-in-time or interval nature. With the advent of Data Warehousing 2.0 (DW 2.0), there is now an increased emphasis on using fully-temporalized databases in the context of data warehousing, in particular with next generation approaches such as Anchor Modeling. In this paper, we will explore the concepts of temporal data, 6NF conceptual database models, and their relationship with DW 2.0. Further, we will also evaluate Anchor Modeling as a conceptual design method in which to capture temporal data. Using these concepts, we will indicate a path forward for evaluating a possible translation of 6NF-compliant data into an eXtensible Markup Language (XML) Schema for the purpose of describing and presenting such data to disparate systems in a structured format suitable for data exchange. Keywords: 6NF, SQL, DKNF, XML, Semantic data change, Valid Time, Transaction Time, DFM. I. INTRODUCTION Normalization is the process of restructuring the logical data model of a database to eliminate redundancy, organize data efficiently, reduce repeating data, and reduce the potential for anomalies during data operations.
  • The Design of Multidimensional Data Model Using Principles of the Anchor Data Modeling: an Assessment of Experimental Approach Based on Query Execution Performance
    WSEAS TRANSACTIONS on COMPUTERS Radek Němec, František Zapletal The Design of Multidimensional Data Model Using Principles of the Anchor Data Modeling: An Assessment of Experimental Approach Based on Query Execution Performance RADEK NĚMEC, FRANTIŠEK ZAPLETAL Department of Systems Engineering, Faculty of Economics, VŠB - Technical University of Ostrava, Sokolská třída 33, 701 21 Ostrava, CZECH REPUBLIC [email protected], [email protected] Abstract: The decision making processes need to reflect changes in the business world in a multidimensional way. This also includes a similar way of viewing the data for carrying out key decisions that ensure the competitiveness of the business. In this paper we focus on the Business Intelligence system as a main toolset that helps in carrying out complex decisions and which requires a multidimensional view of data for this purpose. We propose a novel experimental approach to the design of a multidimensional data model that uses principles of the anchor modeling technique. The proposed approach is expected to bring several benefits like better query execution performance, better support for temporal querying and several others. We provide an assessment of this approach mainly from the query execution performance perspective in this paper. The emphasis is placed on the assessment of this technique as a potential innovative approach for the field of data warehousing, with some implicit principles that could make the process of the design, implementation and maintenance of the data warehouse more effective. The query performance testing was performed in a row-oriented database environment using a sample of 10 star queries executed in the environment of 10 sample multidimensional data models.
  • Star Schema Modeling with Pentaho Data Integration
    A tutorial on star schema modeling with Pentaho Data Integration: defining database connections, moving from an entity-relationship diagram to a star schema, building dimension and fact tables for a data warehouse dimensional model and OLAP cube, writing output with the XML Output step, implementing a Mondrian schema class, and viewing results through the Pentaho User Console and embedded Saiku reporting, with a closing note on Data Vault versus star schemas for the BI back end.
  • Translating Data Between Xml Schema and 6Nf Conceptual Models
    Georgia Southern University Digital Commons@Georgia Southern Electronic Theses and Dissertations Graduate Studies, Jack N. Averitt College of Spring 2012 Translating Data Between Xml Schema and 6Nf Conceptual Models Curtis Maitland Knowles Follow this and additional works at: https://digitalcommons.georgiasouthern.edu/etd Recommended Citation Knowles, Curtis Maitland, "Translating Data Between Xml Schema and 6Nf Conceptual Models" (2012). Electronic Theses and Dissertations. 688. https://digitalcommons.georgiasouthern.edu/etd/688 This thesis (open access) is brought to you for free and open access by the Graduate Studies, Jack N. Averitt College of at Digital Commons@Georgia Southern. It has been accepted for inclusion in Electronic Theses and Dissertations by an authorized administrator of Digital Commons@Georgia Southern. For more information, please contact [email protected]. 1 TRANSLATING DATA BETWEEN XML SCHEMA AND 6NF CONCEPTUAL MODELS by CURTIS M. KNOWLES (Under the Direction of Vladan Jovanovic) ABSTRACT Sixth Normal Form (6NF) is a term used in relational database theory by Christopher Date to describe databases which decompose relational variables to irreducible elements. While this form may be unimportant for non-temporal data, it is certainly important for data containing temporal variables of a point-in-time or interval nature. With the advent of Data Warehousing 2.0 (DW 2.0), there is now an increased emphasis on using fully-temporalized databases in the context of data warehousing, in particular with approaches such as the Anchor Model and Data Vault. In this work, we will explore the concepts of temporal data, 6NF conceptual database models, and their relationship with DW 2.0. Further, we will evaluate the Anchor Model and Data Vault as design methods in which to capture temporal data.
  • Data Vault and 'The Truth' About the Enterprise Data Warehouse
    Data Vault and ‘The Truth’ about the Enterprise Data Warehouse Roelant Vos – 04-05-2012, Brisbane, Australia. Introduction: More often than not, when discussions about data modeling and information architecture move towards the Enterprise Data Warehouse (EDW), heated discussions occur and (alternative) solutions are proposed, supported by claims that these alternatives are quicker and easier to develop than the cumbersome EDW while also delivering value to the business in a more direct way. Apparently, the EDW has a bad image. An image that seems to be associated with long development times, high complexity, difficulties in maintenance and a falling short on promises in general. It is true that in the past many EDW projects have suffered from stagnation during the data integration (development) phase. These projects have stalled in a cycle of changes in ETL and data model design and the subsequent unit and acceptance testing. No usable information is presented to the business users while being stuck in this vicious cycle, and as a result an image of inability is painted by the Data Warehouse team (and often at high costs). Why are things this way? One of the main reasons is that the Data Warehouse / Business Intelligence industry defines and works with architectures that do not (can not) live up to their promises. To date there are (still) many Data Warehouse specialists who argue that an EDW is always expensive, complex and monolithic, whereas the real message should be that an EDW is in fact driven by business cases, adaptive and quick to deliver. (Sidebar: "The Enterprise Data Warehouse has a bad image while in reality it is driven by business cases, adaptive and quick to deliver.")
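For readers who have not seen the technique the article above defends: a Data Vault model separates business keys (hubs), relationships between them (links) and descriptive history (satellites). The sketch below is a minimal, hypothetical illustration of those three table shapes; the names, columns and simplified integer keys are mine, not the author's.

    # Minimal, hypothetical Data Vault shapes (not from the article): a hub
    # holds business keys, a satellite holds descriptive history, and a link
    # records relationships between hubs. Names and columns are illustrative.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE hub_customer (
        customer_hk  INTEGER PRIMARY KEY,   -- surrogate key (simplified)
        customer_bk  TEXT NOT NULL UNIQUE,  -- business key
        load_dts     TEXT NOT NULL,
        record_src   TEXT NOT NULL
    );
    CREATE TABLE sat_customer_details (
        customer_hk  INTEGER NOT NULL REFERENCES hub_customer (customer_hk),
        load_dts     TEXT NOT NULL,
        name         TEXT,
        city         TEXT,
        record_src   TEXT NOT NULL,
        PRIMARY KEY (customer_hk, load_dts)
    );
    CREATE TABLE hub_order (
        order_hk   INTEGER PRIMARY KEY,
        order_bk   TEXT NOT NULL UNIQUE,
        load_dts   TEXT NOT NULL,
        record_src TEXT NOT NULL
    );
    CREATE TABLE link_customer_order (
        customer_hk INTEGER NOT NULL REFERENCES hub_customer (customer_hk),
        order_hk    INTEGER NOT NULL REFERENCES hub_order (order_hk),
        load_dts    TEXT NOT NULL,
        record_src  TEXT NOT NULL,
        PRIMARY KEY (customer_hk, order_hk)
    );
    """)
    con.execute("INSERT INTO hub_customer VALUES (1, 'C-1001', '2015-09-28', 'crm')")
    con.execute("INSERT INTO sat_customer_details VALUES (1, '2015-09-28', 'Acme AG', 'Hamburg', 'crm')")
    con.execute("INSERT INTO hub_order VALUES (10, 'O-7001', '2015-09-28', 'erp')")
    con.execute("INSERT INTO link_customer_order VALUES (1, 10, '2015-09-28', 'erp')")

    # Reassemble a business view by joining hub, satellite and link tables.
    print(con.execute("""
        SELECT h.customer_bk, s.name, o.order_bk
          FROM hub_customer h
          JOIN sat_customer_details s ON s.customer_hk = h.customer_hk
          JOIN link_customer_order  l ON l.customer_hk = h.customer_hk
          JOIN hub_order            o ON o.order_hk    = l.order_hk
    """).fetchall())
    con.close()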
  • Assessment of Helical Anchors Bearing Capacity for Offshore Aquaculture Applications
    The University of Maine DigitalCommons@UMaine Electronic Theses and Dissertations Fogler Library Summer 8-23-2019 Assessment of Helical Anchors Bearing Capacity for Offshore Aquaculture Applications Leon D. Cortes Garcia University of Maine, [email protected] Follow this and additional works at: https://digitalcommons.library.umaine.edu/etd Recommended Citation Cortes Garcia, Leon D., "Assessment of Helical Anchors Bearing Capacity for Offshore Aquaculture Applications" (2019). Electronic Theses and Dissertations. 3094. https://digitalcommons.library.umaine.edu/etd/3094 This Open-Access Thesis is brought to you for free and open access by DigitalCommons@UMaine. It has been accepted for inclusion in Electronic Theses and Dissertations by an authorized administrator of DigitalCommons@UMaine. For more information, please contact [email protected]. ASSESSMENT OF HELICAL ANCHORS BEARING CAPACITY FOR OFFSHORE AQUACULTURE APPLICATIONS By Leon D. Cortes Garcia B.S. National University of Colombia, 2016 A THESIS Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Science (in Civil Engineering) The Graduate School The University of Maine August 2019 Advisory Committee: Aaron P. Gallant, Professor of Civil Engineering, Co‐Advisor Melissa E. Landon, Professor of Civil Engineering, Co‐Advisor Kimberly D. Huguenard, Professor of Civil Engineering Copyright 2019 Leon D. Cortes Garcia All Rights Reserved ii ASSESSMENT OF HELICAL ANCHORS BEARING CAPACITY FOR OFFSHORE AQUACULTURE APPLICATIONS. By Leon Cortes Garcia Thesis Advisor(s): Dr. Aaron Gallant, Dr. Melissa Landon An Abstract of the Thesis Presented in Partial Fulfillment of the Requirements for the Degree of Master of Science (in Civil Engineering) August 2019 Aquaculture in Maine is an important industry with expected growth in the coming years to provide food in an ecological and environmentally sustainable way.
  • Sql Query Optimization for Highly Normalized Big Data
    DECISION MAKING AND BUSINESS INTELLIGENCE. SQL QUERY OPTIMIZATION FOR HIGHLY NORMALIZED BIG DATA. Nikolay I. GOLOV, Lecturer, Department of Business Analytics, School of Business Informatics, Faculty of Business and Management, National Research University Higher School of Economics. Address: 20, Myasnitskaya Street, Moscow, 101000, Russian Federation. E-mail: [email protected] Lars RONNBACK, Lecturer, Department of Computer Science, Stockholm University. Address: SE-106 91 Stockholm, Sweden. E-mail: [email protected] This paper describes an approach for fast ad-hoc analysis of Big Data inside a relational data model. The approach strives to achieve maximal utilization of highly normalized temporary tables through the merge join algorithm. It is designed for the Anchor modeling technique, which requires a very high level of table normalization. Anchor modeling is a novel data warehouse modeling technique, designed for classical databases and adapted by the authors of the article for a Big Data environment and a massively parallel processing (MPP) database. Anchor modeling provides flexibility and high speed of data loading, where the presented approach adds support for fast ad-hoc analysis of Big Data sets (tens of terabytes). Different approaches to query plan optimization are described and estimated, for row-based and column-based databases. Theoretical estimations and results of real data experiments carried out in a column-based MPP environment (HP Vertica) are presented and compared. The results show that the approach is particularly favorable when the available RAM resources are scarce, so that a switch is made from pure in-memory processing to spilling over to hard disk while executing ad-hoc queries.
  • Methods Used in Tieback Wall Design and Construction to Prevent Local Anchor Failure, Progressive Anchorage Failure, and Ground Mass Stability Failure Ralph W
    ERDC/ITL TR-02-11 Innovations for Navigation Projects Research Program Methods Used in Tieback Wall Design and Construction to Prevent Local Anchor Failure, Progressive Anchorage Failure, and Ground Mass Stability Failure Ralph W. Strom and Robert M. Ebeling December 2002 Information Technology Laboratory Approved for public release; distribution is unlimited. Innovations for Navigation Projects ERDC/ITL TR-02-11 Research Program December 2002 Methods Used in Tieback Wall Design and Construction to Prevent Local Anchor Failure, Progressive Anchorage Failure, and Ground Mass Stability Failure Ralph W. Strom 9474 SE Carnaby Way Portland, OR 97266 Concord, MA 01742-2751 Robert M. Ebeling Information Technology Laboratory U.S. Army Engineer Research and Development Center 3909 Halls Ferry Road Vicksburg, MS 39180-6199 Final report Approved for public release; distribution is unlimited. Prepared for U.S. Army Corps of Engineers Washington, DC 20314-1000 Under INP Work Unit 33272 ABSTRACT: A local failure that spreads throughout a tieback wall system can result in progressive collapse. The risk of progressive collapse of tieback wall systems is inherently low because of the capacity of the soil to arch and redistribute loads to adjacent ground anchors. The current practice of the U.S. Army Corps of Engineers is to design tieback walls and ground anchorage systems with sufficient strength to prevent failure due to the loss of a single ground anchor. Results of this investigation indicate that the risk of progressive collapse can be reduced
  • Data Model Matrix
    A one-page matrix relating data-logistics layers (source and delivery, validation, central facts and integration, data marts and generic data access) to levels of abstraction and their concerns, including notation and representation, validation against the logical model, data integration and homogenization, temporal filtering and selection, data enrichment, manipulation (CRUD), performance and scalability, and tooling.
  • Anchor Modeling TDWI Presentation
    Lars Rönnbäck – Anchor Modeling in the Data Warehouse. Did you know that some of the earliest prerequisites for data warehousing were set over 2500 years ago? It is called an anchor model since the anchors tie down a number of attributes. All EER-diagrams have been made with Graphviz; all cats are drawn by the author, Lars Rönnbäck. The great Greek philosopher Heraclitus said "You can never step into the same river twice". What he meant by that is that everything is changing: the next time you step into the river, other waters are flowing by. Likewise the environment surrounding a data warehouse is in constant change, and whenever you revisit it you have to adapt to these changes. (Image painted by Henrik ter Bruggen, courtesy of Wikipedia Commons, public domain.) Five Essential Criteria: a future-proof data warehouse must at least fulfill Value, Maintainability, Usability, Performance and Flexibility; fail in one and there will be consequences. Value – most important; even a very poorly designed data warehouse can survive as long as it is providing good business value. If you fail in providing value, the warehouse will be viewed as a money sink and may be cancelled altogether. Maintainability – you should be able to answer the question: how is the warehouse feeling today? Could be healthy, could be ill! Detect trends, e.g. in loading times. Usability – must be simple and accessible to the end users.
  • Anchor Modelling
    Anchor modeling – better than Data Vault? This article was originally published in Database Magazine 8/2009, with a different layout and title; the copyright for that version therefore rests with Array Publishers, and requests to reproduce that material should be directed to Array. Grundsätzlich IT, M: 06-41966000, W: www.grundsatzlich-it.nl, E: [email protected]. R. Kunenborg, version 1.31 / 26 August 2009 / Final. Contents: Anchor Modeling 3; Introduction 3; History 3; Anchor Modeling 3; The sixth normal form (6NF) 3; Anchor Modeling 4; Key management 7; Views and functions ...
  • Agile Information Modeling in Evolving Data Environments
    Anchor Modeling – Agile Information Modeling in Evolving Data Environments L. Rönnbäck, Resight, Kungsgatan 66, 101 30 Stockholm, Sweden; O. Regardt, Teracom, Kaknästornet, 102 52 Stockholm, Sweden; M. Bergholtz, P. Johannesson, P. Wohed, DSV, Stockholm University, Forum 100, 164 40 Kista, Sweden. Abstract: Maintaining and evolving data warehouses is a complex, error prone, and time consuming activity. The main reason for this state of affairs is that the environment of a data warehouse is in constant change, while the warehouse itself needs to provide a stable and consistent interface to information spanning extended periods of time. In this article, we propose an agile information modeling technique, called Anchor Modeling, that offers non-destructive extensibility mechanisms, thereby enabling robust and flexible management of changes. A key benefit of Anchor Modeling is that changes in a data warehouse environment only require extensions, not modifications, to the data warehouse. Such changes, therefore, do not require immediate modifications of existing applications, since all previous versions of the database schema are available as subsets of the current schema. Anchor Modeling decouples the evolution and application of a database, which when building a data warehouse enables shrinking of the initial project scope. While data models were previously made to capture every facet of a domain in a single phase of development, in Anchor Modeling fragments can be iteratively modeled and applied. We provide a formal and technology independent definition of anchor models and show how anchor models can be realized as relational databases together with examples of schema evolution. We also investigate performance through a number of lab experiments, which indicate that under certain conditions anchor databases perform substantially better than databases constructed using traditional modeling techniques.
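The abstract above claims that changes to an anchor database only require extensions, never modifications, so that earlier schema versions remain available as subsets of the current schema. A hedged sketch of that idea follows, with invented table names and deliberately simplified (non-historized) attribute tables; it is not taken from the article.

    # Hedged sketch of non-destructive schema evolution in the spirit of
    # Anchor Modeling: a new requirement is met by ADDING a table, while
    # every existing table (and any query written against it) is untouched.
    import sqlite3

    con = sqlite3.connect(":memory:")

    # Version 1 of the model: an anchor and one attribute.
    con.executescript("""
    CREATE TABLE CU_Customer (CU_ID INTEGER PRIMARY KEY);
    CREATE TABLE CU_NAM_Customer_Name (
        CU_ID INTEGER NOT NULL REFERENCES CU_Customer (CU_ID),
        Name  TEXT    NOT NULL
    );
    """)
    con.execute("INSERT INTO CU_Customer VALUES (1)")
    con.execute("INSERT INTO CU_NAM_Customer_Name VALUES (1, 'Example AG')")

    old_query = """
        SELECT c.CU_ID, n.Name
          FROM CU_Customer c JOIN CU_NAM_Customer_Name n USING (CU_ID)
    """
    print(con.execute(old_query).fetchall())   # works against version 1

    # Version 2: a new attribute arrives. Extension only: no ALTER TABLE,
    # no change to existing tables, so the old query still runs unchanged.
    con.executescript("""
    CREATE TABLE CU_SEG_Customer_Segment (
        CU_ID   INTEGER NOT NULL REFERENCES CU_Customer (CU_ID),
        Segment TEXT    NOT NULL
    );
    """)
    con.execute("INSERT INTO CU_SEG_Customer_Segment VALUES (1, 'Enterprise')")

    print(con.execute(old_query).fetchall())   # unchanged result, still valid
    print(con.execute("""
        SELECT c.CU_ID, n.Name, s.Segment
          FROM CU_Customer c
          JOIN CU_NAM_Customer_Name    n USING (CU_ID)
          JOIN CU_SEG_Customer_Segment s USING (CU_ID)
    """).fetchall())
    con.close()

The point of the sketch is only that version 2 is reached by CREATE TABLE alone; every query written against version 1 keeps running unchanged.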