
PTS

Data Migration

1 Contents

2 Background ...... 2
3 Challenge ...... 3
3.1 Legacy Data Extraction ...... 3
3.2 Data Cleansing ...... 3
3.3 Data Linking ...... 3
3.4 Data Mapping ...... 3
3.5 New Data Preparation & Load ...... 3
3.6 Legacy Data Retention ...... 4
4 Solution ...... 5
4.1 Legacy Data Extraction ...... 5
4.2 Data Cleansing ...... 5
4.3 Data Linking ...... 6
4.4 Data Mapping ...... 6
4.5 New Data Preparation & Load ...... 6
4.6 Legacy Data Retention ...... 6
5 ETLTOOL – Data Migration Process Cycle ...... 7

Page 1 of 7


2 Background

Data migration is often one of the biggest issues IT projects face. Ensuring the accurate and timely movement of data between multiple systems while maintaining business continuity does not come without risk. The process can be further complicated by differing formats, multiple sources and, in most cases, the sheer volume of data to be moved.

The drivers for data migration are numerous and not always related to a software upgrade or new system implementation.

PTS have worked with a range of businesses with varying data migration requirements including:

 Merging systems as a result of business mergers or acquisitions, where multiple data sources must be combined into a single new application/master.
 Software/system upgrades.
 Moving or consolidating data within or between incumbent and legacy systems.

PTS have over 40 years' combined technical experience and are engaged by businesses to provide client-side technical consultation and services, including Data Migration. In relation to Data Migration, the primary task for PTS is to work in conjunction with the software implementers to manage and perform the Data Migration processes.


3 Challenge

One of the primary challenges businesses face when implementing new software is the migration of data from existing legacy systems, often operating in different jurisdictions, into the new system.

As is the case with implementing any new software system, migrating data from the legacy systems poses a real challenge for businesses, primarily in the following areas:

 Legacy Data Extraction
 Data Cleansing
 Data Linking
 Data Mapping
 New Data Preparation & Load
 Legacy Data Retention

3.1 Legacy Data Extraction

Legacy data extraction ensures that the required legacy table and field data can be extracted from the legacy systems. There may be limited in-house knowledge of the legacy systems' data structure, or of how best to go about extracting data from the legacy system.

3.2 Data Cleansing

Data cleansing ensures that the legacy data is cleansed/transformed from the legacy system format into a format that is understandable to the new system. The client often has little or no understanding of the data limitations or expectations of the new system. Understanding what type of data can cause issues in the new system is vital to producing clean, usable data. Data containing problematic characters, e.g. leading zeros, "/", "&", "'" or "+", can cause adverse, unexpected results when bulk loading data.
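The character handling described above can be sketched as a small cleansing function. This is a minimal illustration, not ETLTOOL's actual logic: the substitutions chosen here (and the rule of keeping a lone zero) are assumptions, since the correct rules always depend on what the target system accepts.

```python
import re

# Hypothetical substitutions for the problem characters named above;
# real cleansing rules come from the target system's requirements.
UNSAFE = str.maketrans({"/": "-", "&": "AND", "'": "", "+": "-"})

def cleanse_code(raw: str) -> str:
    """Normalise a legacy item/account code for bulk loading."""
    code = raw.strip().translate(UNSAFE)
    # Drop leading zeros but keep at least one digit.
    code = re.sub(r"^0+(?=\d)", "", code)
    return code
```

In practice rules like these are agreed with stakeholders via the cleansing sheets, then applied consistently across every extract.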

3.3 Data Linking

Data linking ensures that all relevant data for a linked/merged entity is taken from all legacy systems and correctly merged into a single new data entity. When data comes from multiple systems it may need to be merged, so it is vital to understand which of the common accounts or items is the primary, used for master data, and which are secondary.
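The primary/secondary rule above can be sketched as a merge over records from several systems. The field names and the `primary` flag are invented for illustration; ETLTOOL's own linking model is not described here.

```python
def merge_entities(records):
    """Merge records for the same real-world entity from several
    legacy systems: the record flagged primary supplies the master
    data, secondary records only fill in missing fields."""
    primary = next(r for r in records if r.get("primary"))
    merged = dict(primary)
    for r in records:
        for key, value in r.items():
            merged.setdefault(key, value)  # never overwrite the primary
    return merged
```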

3.4 Data Mapping

Data mapping, where required, maps old legacy codes to new system codes.
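At its simplest this is a lookup table, with unmapped codes surfaced early rather than loaded as bad data. The codes below are invented examples; real mappings come from the stakeholders' mapping sheets.

```python
# Illustrative legacy-to-new VAT code map (values are invented).
VAT_MAP = {"V1": "S1", "V2": "R1", "EX": "Z0"}

def map_code(legacy_code: str, mapping: dict) -> str:
    """Translate a legacy code, failing loudly if no mapping exists."""
    try:
        return mapping[legacy_code]
    except KeyError:
        raise ValueError(f"No mapping defined for legacy code {legacy_code!r}")
```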

3.5 New Data Preparation & Load

New data preparation and load relates to the production of new data files in a format the new system understands, and the loading of that data into the new system.
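Producing a load file typically means rendering the prepared rows with exactly the columns, order and header the target expects. A minimal sketch, assuming a CSV target (the column names are invented):

```python
import csv
import io

def build_load_file(rows, columns):
    """Render prepared rows as CSV with the column order the target
    system expects; columns not in the template are dropped."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```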


3.6 Legacy Data Retention

Undertaking the cleansing, linking and mapping data migration processes in an active production environment poses its own challenges:

 Changes to legacy data in the legacy system may impact day-to-day operations.
 Inability to roll back once legacy data has been changed in the legacy production system.
 Loss of legacy data for future reference.


4 Solution

PTS provide a full and safe solution to mitigate the problems associated with data migration.

The solution is provided via the ETLTOOL data migration package which was designed and developed by PTS. The ETL in ETLTOOL stands for Extraction, Transformation and Load.

ETLTOOL provides functionality to cover the following key features:

 Legacy Data Extraction
 Data Cleansing
 Data Linking
 Data Mapping
 New Data Preparation & Load
 Legacy Data Retention

A more detailed description of each area follows.

4.1 Legacy Data Extraction

 PTS provide consultation and assistance in identifying the key data entities in the legacy systems that are required for the new system.
o In short: what data does the new system require, and where does that data come from?

 ETLTOOL provides direct connections to legacy systems and databases:
o Connections to AWARDS OpenEdge / Progress DBs.
o Connections to NETSUITE SQL DBs via ODBC.
o Connections to SAGE 50 via SDO (Sage Data Objects).

 ETLTOOL provides a separate, independent database used to store:
o Unchanged legacy data
o Cleansed data
o Linked data
o Mapped data
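The staging pattern above, where the raw legacy extract is kept untouched alongside each transformation stage, can be sketched as follows. This is a stand-in, not ETLTOOL's schema: SQLite replaces the real staging database, the table and column names are invented, and in real use the rows would arrive via the ODBC/SDO connections listed above.

```python
import sqlite3

# Minimal stand-in for a staging database with a raw and a cleansed stage.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE legacy_raw (code TEXT, name TEXT)")
conn.execute("CREATE TABLE cleansed (legacy_code TEXT, new_code TEXT, name TEXT)")

# Extract: store the legacy rows exactly as received.
conn.executemany("INSERT INTO legacy_raw VALUES (?, ?)",
                 [("0042/A", "Acme Ltd"), ("0007", "Widget Co")])

# Transform into the cleansed stage; legacy_raw is never modified.
for code, name in conn.execute("SELECT code, name FROM legacy_raw"):
    new_code = code.lstrip("0").replace("/", "-")
    conn.execute("INSERT INTO cleansed VALUES (?, ?, ?)", (code, new_code, name))
conn.commit()
```

Keeping the raw stage immutable is what makes every later stage repeatable and the legacy production system safe.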

4.2 Data Cleansing

 ETLTOOL provides the ability to cleanse data without changing any of the data in the legacy production system, thus mitigating any potential impact on the day-to-day operations of the business:
o Changing the key Item / Product identifiers, e.g. Item Code.
o Changing Item / Product descriptions etc.
o Changing key Business Partner identifiers, e.g. BP Code, Addresses.


4.3 Data Linking

 ETLTOOL provides the ability to link data to ensure that all relevant data for a linked data entity is taken from all legacy systems:
o The same customer or supplier may exist in multiple legacy systems with different identifiers, e.g. a different Customer Code in each system. ETLTOOL provides functionality to identify these data entities and link them together, ensuring that linked data comes across to the new system as a single new data entity.

 All linked data is extracted from the relevant legacy systems and combined/merged into a single data entity in the new system:
o Customers
o Suppliers
o Addresses
o Items

4.4 Data Mapping

 Data mapping may be required in some instances:
o To ensure legacy VAT Codes are mapped to new VAT Codes.
o To ensure legacy Payment Terms are mapped to new Payment Terms.

4.5 New Data Preparation & Load

 ETLTOOL provides the ability to produce data load files containing the newly cleansed, linked and mapped data, in a format expected by the new system:
o Data files required for loading via SAP B1 DTW (Data Transfer Workbench).
o Data files required for loading data directly into SAP B1.
 PTS also manage and control the loading of new data into the new system:
o Loading of required SAP B1 data via DTW.
o Loading of required SAP B1 data directly into SAP B1.
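DTW-style imports typically take a parent file plus a child file for document lines, keyed back to the parent. A sketch of splitting merged order data into such a pair; the column names follow common SAP B1 DTW conventions ("ParentKey" in particular), but the exact headers should always be taken from the shipped DTW templates rather than from this example.

```python
def split_orders(orders):
    """Split nested order records into a DTW-style header file and a
    lines file keyed to the parent document number."""
    headers, lines = [], []
    for order in orders:
        headers.append({"DocNum": order["DocNum"], "CardCode": order["CardCode"]})
        for line in order["Lines"]:
            lines.append({"ParentKey": order["DocNum"],
                          "ItemCode": line["ItemCode"],
                          "Quantity": line["Quantity"]})
    return headers, lines
```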

4.6 Legacy Data Retention

 Using ETLTOOL to manage the data migration completely mitigates the risk to legacy data in the production system:
o There are no changes to data in the legacy system, mitigating the risk of impact to day-to-day operations.
o There is never a need to roll back data in the legacy production system, as it is never changed.
o Legacy data is fully retained in its existing state.


5 ETLTOOL – Data Migration Process Cycle

The following is a brief synopsis of the data migration process cycle.

 Identify the legacy system source of the required data entities.
o Could be Customers, Suppliers, Items, Opening Balances etc.
 Produce data cleansing sheets from ETLTOOL.
o These are circulated to the key stakeholders responsible for cleansing.
 Track data cleansing sheets.
o Who has which sheet, and when the sheet is due back.
 Load cleansed data into ETLTOOL.
o ETLTOOL will combine cleansed and non-cleansed data.
 Produce linking / mapping sheets.
o These are circulated to the key stakeholders responsible for linking / mapping, if required.
 Track linking / mapping data sheets.
o Who has which sheet, and when the sheet is due back.
 Load linking / mapping data into ETLTOOL.
o ETLTOOL will combine cleansed, non-cleansed, linked & mapped data.
 Produce data load files for populating the new system.
o ETLTOOL produces SAP B1 DTW files for data loaded via DTW.
o ETLTOOL produces standard CSV or TXT files for data loaded directly into SAP B1.
 Load files into the new system WIP (Work In Progress) DB.
o Load the data files for SAP B1 via DTW into WIP.
o Load the required data files directly into SAP B1 in WIP.
 Test cycle #1.
o WIP is copied to a new UAT (User Acceptance Test) DB.
o There can be multiple UAT cycles, each usually running for a week.
 Further data loads can be tested in a new WIP while UAT is ongoing.
o Once the UAT DB has been created, WIP is scrapped and re-started.
o Fresh data is loaded into the new WIP.
 Test cycle #2.
o WIP is copied to a new UAT DB.
o There can be multiple UAT cycles, each usually running for a week.
 Opening Balances sign-off.
o The Customer & Supplier opening balances can be checked and verified in WIP.
o The Stock opening balances can be checked and verified in WIP.
 WIP copied to LIVE.
o Once all opening balances have been checked and verified, WIP is copied over to a new LIVE database.
o The Customer, Supplier and Stock opening balances can then be re-checked and signed off in LIVE.
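The sheet-tracking steps above ("who has which sheet, and when it is due back") amount to a small register. A toy sketch, with invented sheet names, owners and dates:

```python
from datetime import date

# Invented example entries for circulated cleansing/linking sheets.
sheets = [
    {"sheet": "Items cleansing", "owner": "Stock team",
     "due": date(2024, 3, 1), "returned": False},
    {"sheet": "BP linking", "owner": "Sales ledger",
     "due": date(2024, 3, 8), "returned": True},
]

def outstanding(sheets, today):
    """List sheets not yet returned, flagging those past their due date."""
    return [(s["sheet"], s["owner"], s["due"] < today)
            for s in sheets if not s["returned"]]
```

In practice this tracking is usually done in a shared spreadsheet or in the migration tool itself; the point is simply that ownership and due dates are recorded for every circulated sheet.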
