The Importance of a Single Platform for Data Integration and Quality Management


helping build the smart and agile business

Colin White, BI Research
March 2008

Sponsored by Business Objects

TABLE OF CONTENTS

DATA INTEGRATION AND QUALITY: UNDERSTANDING THE PROBLEM
    The Evolution of Data Integration and Quality Software
    Building a Single Data Services Architecture
        Applications
        Service-Oriented Architecture Layer
        Data Services Techniques
        Data Services Management and Operations
    Choosing Data Services Products
BUSINESS OBJECTS DATA SERVICES PLATFORM
    BusinessObjects Data Services XI 3.0
    Getting Started: Success Factors

Brand and product names mentioned in this paper may be the trademarks or registered trademarks of their respective owners.

DATA INTEGRATION AND QUALITY: UNDERSTANDING THE PROBLEM

Companies are fighting a constant battle to integrate business data and content while managing data quality in their organizations. Compounding this difficulty is the growing use of workgroup computing and Web technologies, the storing of more data and content online, and the need to retain information longer for compliance reasons. These trends are causing data volumes to increase dramatically.

The growing number of data sources is causing data integration problems

Rising volumes are not the only cause of data integration and quality issues, however. The growing number of disparate systems that produce and distribute data and content also adds to the complexity of the data integration and quality management environment. Business mergers and acquisitions only exacerbate the situation.

Data quality management in many companies is immature

Most organizations use a variety of software products to handle the integration of disparate data and content, and to manage data quality. Often, custom solutions are required for complex and legacy data environments. Although data integration projects have grown rapidly, budget and time pressures often lead to data quality issues being ignored by project developers. The result is that data quality management has not kept pace with the growth of data integration projects, and its use in many companies is still immature.

Vendors are now providing more consolidated data integration and data quality products

Business user complaints and compliance legislation are forcing IT groups to devote energy and resources to solving data quality problems. Nevertheless, many data quality projects are still implemented separately from those for data integration. One reason for this is that in the past data quality tools were developed and marketed by a different set of vendors than those that supply data integration products. This has led to fractured purchasing strategies and skills development in IT groups. Vendor acquisitions and mergers have led to consolidated solutions, but product integration issues still remain.

A data services architecture is required for enterprise-wide data integration and quality management

If companies are to manage the integration and quality of the ever-increasing information mountain in their organizations, they need to design and build a data services architecture that provides a single environment for enterprise-wide business data and content integration and quality management (see Figure 1).
This paper examines the evolution of the data integration and quality industry, and explains the benefits of moving toward a single data services architecture. It outlines requirements for a software platform to support such an architecture, and, as an example, reviews the BusinessObjects™ Data Services XI Release 3 platform from Business Objects, an SAP company.

THE EVOLUTION OF DATA INTEGRATION AND QUALITY SOFTWARE

Although data integration and quality problems have been widespread in companies throughout the history of computing, they worsened noticeably when organizations moved away from centralized systems to distributed processing involving client/server computing and, more recently, Web-based systems. While there is no question that the move toward distributed processing improved access to data, which in turn enhanced business user decision-making and action-taking, it nevertheless increased the complexity of data integration and quality management tasks in organizations.

Figure 1. Data Services: a Single Environment for Data Integration and Quality Management

Data warehousing and BI projects have helped improve data quality

Improvements in data integration and quality came with the introduction of data warehousing and business intelligence (BI). The business intelligence market has seen tremendous growth, and for many organizations business intelligence has become a key asset that enables them to optimize BI operations to reduce costs and maintain a competitive advantage. Business intelligence applications in these companies have become mission-critical because of the important role they play in the decision-making process. This reliance will grow as companies move toward using business intelligence not only for strategic and tactical decision-making, but also for driving daily and intraday business operations.

Data integration and quality management is an enterprise-wide problem

The use of data warehousing and business intelligence has led to a much better understanding of how data flows through the business and how it is used to make decisions. This is especially true for legacy system data, which is often poorly documented. This understanding is helping organizations deploy other data integration and quality projects that may not be directly related to business intelligence. Master data management is an example here. The result is that more organizations now view data integration and quality as an enterprise-wide problem, not just an issue to be solved when building a data warehouse and business intelligence applications.

An enterprise-wide data services environment has remained elusive

Although companies have increased their spending on data integration and quality products, a single enterprise-wide data services solution has often remained elusive, due both to the complexity of the tasks involved and to the lack of a consistent approach to information management across the enterprise.

The solution is to develop an enterprise data services architecture, deploy a single and open data services platform to support this architecture, fill any gaps in the platform with third-party or custom-built software, and gradually evolve existing data integration and data quality projects to support the new data services environment.

Six key aspects of an enterprise-wide data services architecture

The main characteristics of an enterprise-wide data services architecture are as follows (a brief sketch after the list illustrates the shared-transform idea):

• A single environment for data integration and data quality management
• A common developer user interface and workbench
• A single set of source data and content acquisition adapters
• Shared and reusable data integration and data quality cleansing transforms
• A single operations console and runtime environment
• Shared metadata and metadata management services
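To make the "shared and reusable transforms" characteristic concrete, here is a minimal sketch in plain Python. All names are hypothetical and the paper prescribes no implementation; the point is simply that one cleansing rule, registered once, can be reused by both a batch integration job and an interactive quality check:

    from typing import Callable, Dict

    # A single registry standing in for shared metadata about transforms.
    TRANSFORMS: Dict[str, Callable[[dict], dict]] = {}

    def transform(name: str):
        """Register a reusable transform under a shared name."""
        def wrap(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
            TRANSFORMS[name] = fn
            return fn
        return wrap

    @transform("standardize_country")
    def standardize_country(record: dict) -> dict:
        # One cleansing rule, written once and reused everywhere.
        aliases = {"U.S.": "US", "USA": "US", "United States": "US"}
        country = record.get("country", "")
        record["country"] = aliases.get(country, country)
        return record

    def run_batch_job(records: list, names: list) -> list:
        """A batch integration job applies shared transforms in sequence."""
        for name in names:
            records = [TRANSFORMS[name](dict(r)) for r in records]
        return records

    def check_record(record: dict, names: list) -> dict:
        """An interactive quality service reuses the same transforms."""
        for name in names:
            record = TRANSFORMS[name](dict(record))
        return record

    if __name__ == "__main__":
        print(run_batch_job([{"country": "USA"}, {"country": "U.S."}],
                            ["standardize_country"]))
        print(check_record({"country": "United States"}, ["standardize_country"]))

Because both paths resolve transforms through the same registry, a rule changed in one place immediately applies to batch jobs and interactive checks alike, which is the reuse benefit the list above is driving at.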
Many benefits to having a single data services environment

Although it will take time for organizations to move toward a single data services environment that supports both data integration and data quality management, there are significant benefits to doing so:

• Organizations are more effective and competitive because they have access to consistent and trusted data
• IT architecture is simpler, which reduces IT maintenance and development costs
• Development cycle time is reduced due to a common data integration and data quality management environment
• Data standards are easier to enforce and maintain because data integration and data quality processes can be shared and reused across projects

BUILDING A SINGLE DATA SERVICES ARCHITECTURE

Figure 2 illustrates the key requirements for building a single enterprise-wide data services environment. These requirements fall into four main areas: applications, application interfaces, techniques, and management.

Applications

The applications component represents those business applications that require data services for improving data quality and integrating data and content. Business transaction processing, master data management, and business intelligence are key examples here. The move toward a service-oriented architecture (SOA) based on Web services is adding applications such as business content management and business collaboration to the applications mix.

Figure 2. Data Services Requirements

Service-Oriented Architecture Layer

Most data quality and integration projects involve batch applications that gather data from multiple sources, clean and integrate it, and then load the results into a target data file or database (a minimal sketch of this model follows below). With demand growing for lower-latency data and a services-based architecture, this model of data integration processing must be enhanced and made more dynamic.

Developers need a set of dynamic data services

Developers now want to build applications that can use data services interactively,
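The batch model described under the Service-Oriented Architecture Layer (gather, cleanse, load into a target) can be pictured with a short, self-contained sketch. It uses plain Python with SQLite and made-up data; the paper itself does not prescribe any particular tooling:

    import csv
    import io
    import sqlite3

    # Source 1: a CSV extract, inlined here so the sketch runs as-is.
    CRM_CSV = "customer_id,email\n101, Alice@Example.COM \n102,bob@example.com\n"

    # Source 2: records already in memory, e.g. from an application API.
    ERP_ROWS = [(103, "CAROL@EXAMPLE.COM"), (101, "alice@example.com")]

    def extract() -> list:
        """Gather records from multiple disparate sources."""
        rows = [(int(r["customer_id"]), r["email"])
                for r in csv.DictReader(io.StringIO(CRM_CSV))]
        return rows + list(ERP_ROWS)

    def cleanse(rows: list) -> list:
        """Apply simple quality rules: standardize emails, drop duplicates."""
        seen, out = set(), []
        for cid, email in rows:
            if cid in seen:
                continue  # keep the first occurrence of each customer id
            seen.add(cid)
            out.append((cid, email.strip().lower()))
        return out

    def load(rows: list) -> sqlite3.Connection:
        """Load the integrated result into a target table."""
        con = sqlite3.connect(":memory:")  # stands in for a warehouse table
        con.execute("CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, email TEXT)")
        con.executemany("INSERT INTO customer VALUES (?, ?)", rows)
        return con

    if __name__ == "__main__":
        con = load(cleanse(extract()))
        for row in con.execute("SELECT * FROM customer ORDER BY customer_id"):
            print(row)

Making this model more dynamic, as the paragraph above argues, amounts to exposing steps such as cleanse as callable services that applications can invoke on demand, rather than only inside a scheduled batch run.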