International Journal of Database Management Systems (IJDMS) Vol.12, No.3, June 2020
DOI: 10.5121/ijdms.2020.12301

DESIGN, IMPLEMENTATION, AND ASSESSMENT OF INNOVATIVE DATA WAREHOUSING; EXTRACT, TRANSFORMATION, AND LOAD (ETL); AND ONLINE ANALYTICAL PROCESSING (OLAP) IN BI

Ramesh Venkatakrishnan
Final Year Doctoral Student, Colorado Technical University, Colorado, USA

ABSTRACT

The effectiveness of a Business Intelligence (BI) system depends heavily on three fundamental components: 1) Data Acquisition (ETL), 2) Data Storage (Data Warehouse), and 3) Data Analytics (OLAP). The predominant challenges with these components are data volume, data variety, data integration, complex analytics, constant business change, lack of skill sets, compliance, security, data quality, and computing requirements. No comprehensive documentation provides guidelines for ETL, data warehousing, and OLAP that cover recent trends such as data latency (to provide real-time data), BI flexibility (to accommodate changes driven by the explosion of data), and self-service BI. This research paper attempts to fill that gap by analyzing scholarly articles from the last three to five years to compile guidelines for the effective design, implementation, and assessment of DW, ETL, and OLAP in BI.

KEYWORDS

Business Intelligence, ETL, DW, OLAP, design, implementation and assessment

1. INTRODUCTION

"Business intelligence (BI) is an umbrella term that includes the applications, infrastructure and tools, and best practices that enable access to and analysis of information to improve and optimize decisions and performance" [1]. Data acquisition, data storage, and data analytics are the primary components of a BI system, and for a BI system to be successful, sound design and implementation of these three components are critical. Source systems such as OLTP applications record business events. ETL stands for Extract, Transform, Load; it is the set of tools and processes that extracts data from source systems and loads it into a data warehouse after transformation. The data warehouse (DW) is the core component of a BI system, where data from various sources is stored centrally. OLAP stands for Online Analytical Processing and is responsible for analyzing the warehouse data, aggregating it, and presenting it in a multi-dimensional format to answer forecasting questions.

The effectiveness of a BI system is hugely dependent on these three fundamental components: Data Acquisition (ETL), Data Storage (Data Warehouse), and Data Analytics (OLAP). There is no comprehensive documentation that provides guidelines for ETL, data warehousing, and OLAP covering recent trends such as data latency (to provide real-time data), BI flexibility (to accommodate changes with the explosion of data), and self-service BI. The purpose of this article is to analyze scholarly articles from the last three to five years to compile guidelines for the effective design, implementation, and assessment of DW, ETL, and OLAP in BI. The drivers for these "updated" guidelines are agility, the next generation of data, cloud computing, real-time data, situational awareness, and self-service. The new design, implementation, and assessment guidelines for DW, ETL, and OLAP would help decision-makers and BI IT practitioners proactively avoid the "known" pitfalls.
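To make the roles of the three components concrete, the following is a minimal sketch in Python using SQLite as a stand-in warehouse. The fact_sales table, column names, and cleansing rules are illustrative assumptions for this sketch only and are not prescribed by the paper.

```python
# Minimal sketch of the ETL -> data warehouse -> OLAP flow described above.
# SQLite stands in for the warehouse; table and column names are illustrative.
import sqlite3

def extract(source_rows):
    """Extract: read business events from an OLTP source (here, an in-memory list)."""
    return [r for r in source_rows if r.get("amount") is not None]  # drop unusable events

def transform(rows):
    """Transform: apply cleansing/conversion rules into one standard format."""
    return [(r["order_id"], r["region"].strip().upper(), float(r["amount"])) for r in rows]

def load(conn, rows):
    """Load: append the conformed rows into a warehouse fact table."""
    conn.execute("CREATE TABLE IF NOT EXISTS fact_sales (order_id INTEGER, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", rows)
    conn.commit()

def olap_rollup(conn):
    """OLAP-style aggregation: roll sales up along one dimension (region) of the cube."""
    return conn.execute(
        "SELECT region, SUM(amount), COUNT(*) FROM fact_sales GROUP BY region"
    ).fetchall()

if __name__ == "__main__":
    source = [
        {"order_id": 1, "region": "east ", "amount": "120.50"},
        {"order_id": 2, "region": "West", "amount": "75.00"},
        {"order_id": 3, "region": "East", "amount": None},  # rejected during extract
    ]
    conn = sqlite3.connect(":memory:")
    load(conn, transform(extract(source)))
    print(olap_rollup(conn))  # e.g. [('EAST', 120.5, 1), ('WEST', 75.0, 1)]
```

The GROUP BY roll-up here corresponds to one slice of the multi-dimensional aggregation an OLAP engine would perform across many dimensions at once.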
This approach also helps BI practitioners prepare their systems to be flexible enough to accommodate the data explosion and to move from IT-led BI to IT-enabled BI. This paper is organized into three main sections. The first section describes the challenges associated with all three fundamental components of BI (Data Acquisition, Data Storage, and Data Analytics). The second section provides guidelines for design and implementation. The third section provides an assessment methodology for ongoing effectiveness.

2. RELATED WORK

Traditional computing techniques are a significant limitation in dealing with large volumes of data [2]. Santos, Silva, and Belo (2014) highlighted handling vast data sets within a short time slot as a challenge for data warehouse systems because of the massive computational and storage resources required [3]. Bousty, Krit, Elasikiri, Dani, Karimi, Bendaoud, and Kabrane (2018) recommended continuous updates of server and storage processing capacities to tackle new data requirements [4]. Their research also recommends utilizing cloud-based solutions to keep processing capabilities elastic and reduce additional investment. Vo, Thomas, Cho, De, and Choi (2018) recommend three new features for next-generation BI: Operational BI (near real-time), Situational BI (real-time), and Self-Service BI (reduced dependency on IT staff) [5]. Santos et al. (2014), in their research, recommended a grid-environment-based scheduling solution for small to medium ETL processing needs [3]. Extensive ETL processing remains a challenge. Separating OLTP and OLAP processing and using specialized databases such as NoSQL, columnar, in-memory, and hybrid databases appear to be common recommendations across most of the studies.

3. THE MAIN CHALLENGES FOR DATA ACQUISITION, DATA STORAGE, AND DATA ANALYSIS

The main challenges for Data Acquisition, Data Storage, and Data Analysis are data volume, data variety, data integration, complex analytics, constant business change, skill sets, compliance, security, data quality, and computing. The following paragraphs explain these challenges for each of the three fundamental components (ETL, DW, and OLAP).

3.1. ETL

ETL-specific challenges lie in deciding which data is relevant for extraction, minimizing the performance overhead on the source system, keeping data conversion and data clean-up flexible, and balancing real-time and batch loading. The extraction process has the potential to cause performance impacts on the source system [6]. Source data systems are typically of different types; common formats are relational databases, flat files, and non-relational databases (IMS, NoSQL). The extraction process should be able to read these diverse data types, and the transformation process should be able to apply complex rules to convert them into a single standard format for loading.

The loading process typically happens as incremental batch jobs. These batch jobs are scheduled with minimal intervals to facilitate near-real-time loading, or with no intervals to support real-time loading. Completing one scheduled load before the next one starts is critical and mandates the implementation of audit trails to store all data changes associated with each batch job, as the sketch below illustrates.
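The batch bookkeeping described above can be sketched as follows. This is a simplified illustration, assuming a SQLite warehouse, a source orders table carrying an updated_at timestamp, and hypothetical etl_audit and fact_orders tables; it is not the paper's prescribed implementation.

```python
# Sketch of an incremental (watermark-based) batch load with an audit trail.
# Assumes a source table `orders(order_id, updated_at, amount)` with ISO timestamps.
import sqlite3
from datetime import datetime, timezone

def run_incremental_load(src: sqlite3.Connection, dw: sqlite3.Connection) -> None:
    dw.execute("CREATE TABLE IF NOT EXISTS etl_audit ("
               "batch_id INTEGER PRIMARY KEY AUTOINCREMENT, started_at TEXT, "
               "finished_at TEXT, high_watermark TEXT, rows_loaded INTEGER, status TEXT)")
    dw.execute("CREATE TABLE IF NOT EXISTS fact_orders (order_id INTEGER, updated_at TEXT, amount REAL)")

    # Guard against overlapping schedules: skip if the previous batch is still marked RUNNING.
    # (A production job would also detect and recover crashed batches; omitted here.)
    if dw.execute("SELECT 1 FROM etl_audit WHERE status = 'RUNNING'").fetchone():
        return

    # High watermark of the last successful batch; everything newer is this batch's delta.
    last_wm = dw.execute(
        "SELECT COALESCE(MAX(high_watermark), '1970-01-01T00:00:00') "
        "FROM etl_audit WHERE status = 'OK'"
    ).fetchone()[0]

    started = datetime.now(timezone.utc).isoformat()
    cur = dw.execute("INSERT INTO etl_audit (started_at, high_watermark, rows_loaded, status) "
                     "VALUES (?, ?, 0, 'RUNNING')", (started, last_wm))
    batch_id = cur.lastrowid
    dw.commit()

    # Extract only rows changed since the last successful batch (delta extraction).
    rows = src.execute(
        "SELECT order_id, updated_at, amount FROM orders WHERE updated_at > ?", (last_wm,)
    ).fetchall()

    dw.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", rows)
    new_wm = max((r[1] for r in rows), default=last_wm)
    dw.execute("UPDATE etl_audit SET finished_at = ?, high_watermark = ?, rows_loaded = ?, "
               "status = 'OK' WHERE batch_id = ?",
               (datetime.now(timezone.utc).isoformat(), new_wm, len(rows), batch_id))
    dw.commit()
```

The etl_audit table doubles as the scheduling guard and the audit trail: each batch records when it ran, how many rows it moved, and the watermark it advanced to, so reloads and failures can be traced.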
Guo, Yuan, Sun, and Yue (2015) note that space efficiency for traditional ETL is nearly zero because of the redundant storage of data during the staging phase [7]. Frequent re-loading takes place alongside application logic changes, causing further data redundancy [7]. The lack of a standard conceptual model to represent ETL processes is a real problem [8]. It makes ETL processes less flexible; every application logic change and every addition of a new data source results in a significant amount of work and rework.

3.2. Data Warehouse

Data-warehouse-specific challenges relate mainly to storing a massive amount of data. Data integration, duplication, cleansing, and the timing of data often add complexity to maintaining data quality, and suboptimal decisions are often due to these data quality issues. Complexities with the volume (emergence of IoT), velocity (real-time), and variety (disparate source systems) of data place a significant dependency on data quality. Khoso (2016) estimated that the total amount of data in use would grow to 44 zettabytes in 2020 [9]. Taking healthcare as an example, volume (electronic health records, patient data such as biometrics), variety (doctor notes, images), and velocity (IoT, wearables) all require data quality to be maintained to help with patient health, detection of diseases, and cost efficiency.

The capability of the DW system to support real-time and on-demand workloads is quintessential. However, availability and performance issues pose a significant barrier to supporting real-time workloads [10]. No simplified DW scalability solution exists to tackle the explosion of data growth. Reporting on compliance (such as GDPR) and security requires a tremendous number of changes in the data warehouse architecture. Complexities exist with adherence to query-response service level agreements (SLAs) for all sizes of data, ranging from single-digit-terabyte databases to petabyte-sized databases.

3.3. OLAP

OLAP-specific challenges include sophisticated analytics and diverse data types. The limitations of the underlying DBMS and compatibility add complexity to advanced analytics. Whether to store analytic data inside the data warehouse or in a separate analytic database is a crucial design decision [10]. Computing requirements for advanced analytics are high, and accurately estimating those compute requirements remains difficult. The variety and volume of the data make data visualization and dashboards highly dependent on IT staff.

4. DESIGN AND IMPLEMENTATION GUIDELINES FOR ETL, DW, AND OLAP

The core of the design principle of ETL, DW, and OLAP is to maintain its life cycle as a continuous
