
DATE: 7/23/21

Data Integration Developer

Position Description
Title: Integration Developer
Department: 520 - MIS
Immediate Supervisor: Director – Application Development
Status: Exempt

Position Purpose: The Developer is responsible for helping implement the processes required for a data processing pipeline in support of our products and services. The Data Integration Developer's primary responsibilities include implementing ETL (extract, transform, and load) pipelines and monitoring and maintaining data pipeline performance. The Data Integration Developer is experienced in data exploration, data manipulation, and reporting. The ideal candidate has three or more years' experience working on solutions that collect, process, store, and analyze data.
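For illustration only, the sketch below shows one way a small extract-transform-load step of this kind might look in Python with pandas and SQLAlchemy; the file path, connection string, and table names are assumptions made for the example, not systems referenced by this position.

# Minimal ETL sketch (illustrative only): extract rows from a CSV export,
# apply a simple transformation, and load the result into a SQL Server
# staging table. All paths, credentials, and table names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

# Extract: read a daily order export (hypothetical file)
orders = pd.read_csv("daily_orders.csv", parse_dates=["order_date"])

# Transform: standardize column names and derive a reporting field
orders.columns = [c.strip().lower() for c in orders.columns]
orders["order_month"] = orders["order_date"].dt.to_period("M").astype(str)

# Load: append into a staging table in SQL Server (hypothetical connection)
engine = create_engine(
    "mssql+pyodbc://user:password@server/warehouse?driver=ODBC+Driver+17+for+SQL+Server"
)
orders.to_sql("stg_daily_orders", engine, schema="dbo", if_exists="append", index=False)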

Principal Accountabilities
1. Design and develop Data Mart solutions to support the business reporting requirements of stakeholders across multiple business verticals, including Finance, Marketing, Logistics, and Product.
2. Conceptualize, design, and develop data models, entities, and relationships.
3. Adhere to processes and design best practices to ensure data extractions meet quality standards and are enhanced for analytical purposes to support a "single source of truth."
4. Provide 24/7 support and maintenance of the Enterprise Data Warehouse.
5. Create and propose technical design documentation, including current and future ETL development functionality, objects affected, specifications, and flows/diagrams detailing the proposed implementation.
6. Design, develop, test, tune, and implement procedural database code using ETL tools and Microsoft SQL Server.
7. Perform related tasks as requested by the supervisor.

Essential Skills and Experience
• Experience working with complex applications across various data sources and platforms.
• Experience using data to answer business questions and solve data challenges.
• Experience designing solutions that leverage a diverse assortment of data sources.
• Experience with ETL tools such as SQL Server Integration Services, Databricks, or Spark.
• Experience with scripting languages, including PowerShell, Python, and SQL.
• Understanding of common database technologies, such as SQL Database/Server, SQL Data Warehouse, Oracle, and MySQL, and other data sources, such as Azure Storage and Azure Blob Storage.
• Understanding of data governance and creating data dictionaries.
• Exposure to cloud data technologies (EC2, S3, Glue, ETL pipelines, IICS, MuleSoft) preferred.
• Cloud solutions: familiarity with SaaS, PaaS, IaaS, and cloud storage preferred.
• Understanding of enterprise data integration tools such as Informatica, DataStage, or Talend preferred.
