Shruthi

 Profile Summary:
 An IT professional with 7+ years of experience as an Informatica/Teradata Developer on mission-critical software applications, with a strong background in Data Warehousing on Oracle, Teradata, and SQL Server database technologies.
 Well versed in Data Warehousing architecture and in the design, development, extraction, transformation, and loading of data using the Informatica PowerCenter 8.6/9.1/9.5.1 ETL tool.
 Experienced in Cognos 10, 8, and ReportNet, developing reporting applications and dashboard reports with Report Studio, Framework Manager, Query Studio, Analysis Studio, Metric Studio, and Cognos Connection.
 Experience in all phases of the software development life cycle (SDLC) and the data warehouse life cycle, including requirement analysis, design, coding, testing, and deployment. Familiar with Agile/Scrum.
 Strong knowledge of entity-relationship modeling, fact and dimension tables, slowly changing dimensions, and dimensional modeling (Star Schema and Snowflake Schema).
 Extensive experience in developing mappings for the extraction, transformation, and loading (ETL) of data from various sources into Data Warehouses/Data Marts.
 Expert in Slowly Changing Dimensions Type 1, Type 2, and Type 3 for inserting into and updating target tables while maintaining history.
 Experienced in creating transformations (Joiner, Lookup, Sorter, Aggregator, Expression, Update Strategy, Router, Filter, Sequence Generator, Normalizer, and Rank) and mappings in Informatica Designer, and processing tasks in Workflow Manager, to move data from multiple sources into targets.
 Extensively worked on developing and debugging Informatica reusable transformations, mappings, mapplets, sessions, workflows, and worklets, and on identifying areas for optimization.
 Worked on performance tuning, identifying and resolving bottlenecks at various levels: sources, targets, mappings, and sessions.
 Extensively used SQL and PL/SQL in the creation of triggers, functions, indexes, views, cursors, and stored procedures.
 Experience working with large data warehouses, mapping and extracting data from legacy systems and Netezza/SQL Server 2008/Oracle/AS400/DB2 UDB databases.
 Good knowledge of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
 Extensively worked on planning the metadata strategy and responsible for maintaining metadata in the Enterprise Data Warehouse (EDW).
 Proficient in integrating various data sources such as Oracle, SQL Server, DB2, Teradata, MS Access, COBOL, Mainframe, Excel, XML, VSAM files, and flat files.
 Experience in managing Informatica releases and providing 24/7 production support.
 Skilled in implementing Data Warehousing and Business Intelligence solutions.
 Expertise in the Ralph Kimball and Bill Inmon data warehouse methodologies.
 Accomplished data warehouse architect, data modeler, and designer.
 Experienced in bug tracking, bug reporting, and bug documentation, with proven skills in developing and maintaining test cases across a variety of environments and technologies.
 Well organized and goal oriented, with excellent troubleshooting and problem-solving skills.
 Technical Skills
ETL: Informatica PowerCenter 9.x/8.x, DataStage, SSIS
Designing and scheduling tools: Erwin, Visio, Autosys, Control-M
Databases: Oracle 11g/10g/9i, SQL Server, Teradata, XML, Sybase, Netezza
Database client tools: SQL*Plus, SQL*Loader, Toad 7.5
Languages: C, C++, HTML, SQL, PL/SQL, Visual Basic, Unix Shell Scripting, Java, Perl
BI tools: Cognos 8.x, Business Objects XI, MicroStrategy
Operating systems: Windows 98/2000/NT/XP/8, UNIX
Others: MS Office (Word, Excel, PowerPoint), SharePoint

 Professional Experience

Client: Abbott Labs, IL July’13 – present
Role: Sr. Informatica Developer

Responsibilities:

 Worked closely with Business Analysts to understand the data requirements, definitions, and business rules to be implemented.
 Gathered business rules and requirements of different lines of business for the ETL process from various development centers in the enterprise.
 Installed and configured Oracle Warehouse Builder.
 Worked on the Oracle-to-Teradata migration project, in which the existing Data Marts were migrated from the Oracle environment to Teradata.
 Developed BTEQ scripts and designed data feeds to load data into Teradata from Oracle.
 Implemented Teradata utilities such as FastLoad, MultiLoad, and FastExport, as well as Teradata Parallel Transporter connections, to enhance load runtime and performance.
 Involved in the creation of new objects (tables/views, triggers, indexes, keys) in Teradata and modified the existing 700+ ETLs to point to the appropriate environment.
 Prepared the required application design documents based on the functionality required.
 Used Informatica as the ETL data movement tool to create source/target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various platforms.
 Developed mappings using flat files and relational databases as sources.
 Developed reusable transformations and mapplets and used them in mappings.
 Used SQL override queries in the Source Analyzer to customize mappings.
 Used the Informatica Debugger to find troubleshooting information about data and error conditions in the mappings.
 Developed expertise in BI Publisher development and supported end users' efforts to use this reporting tool.
 Provided internal consulting assistance to the report-writing teams.
 Participated in the implementation of OBIEE product upgrades.
 Designed and conducted unit tests of new or changed RPD objects and ETL mappings.
 Created metadata for the EDW implementation.
 Performed data quality analysis to determine cleansing requirements.
 Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer, using both Informatica PowerCenter and IDQ.
 Created UAT documentation and conducted presentations for the end users.
 Responsible for creating mappings and transforming existing feeds into the new data structures while maintaining client standards, utilizing Normalizer, Router, Lookup (connected and unconnected), Expression, Aggregator, and Update Strategy transformations.
 Led the setup of Informatica Cloud environments and developed standards for the installation and development of Informatica Cloud tasks.
 Extensive experience designing and working with Oracle: writing SQL queries and PL/SQL procedures, functions, packages, database triggers, exception handlers, cursors, and database objects, with a good understanding of the Oracle data dictionary.
 Implemented Informatica Cloud environments on Windows 2008 servers for various divisions.
 Installed Apache for sharing files on Windows 2008 servers.
 Established coding standards and naming conventions to be followed for Informatica Cloud development.
 Scheduled MDM stage jobs and load jobs using the Utilities workbench.
 Worked on Informatica MDM processes, including batch-based and real-time processing.
 Created Oracle procedures and mappings using OWB 10gR2.
 Designed the ETL specification for the staging (STG) and DW areas.
 Created process flows and scheduled and monitored sessions.
 Designed the LDM and KPIs. Worked on tuning OWB mappings.
 Handled UNIX operating system tasks to run Informatica jobs using scheduling tools such as Control-M, and generated pre- and post-session UNIX shell scripts.
 Supported migration of ETL code from development to QA and then from QA to production.
 Generated complex reports in Cognos 10.1 Report Studio, including drill-down reports from a DMR-modeled Framework Manager model.
 Created dashboards to present critical company data in a single report.
 Responsible for regression testing ETL jobs before test-to-production migration.
 Performed database/ETL migrations from the Dev environment to Test/Training/UAT/Staging and PROD environments.
 Provided support and quality validation through test cases for all stages of unit and integration testing.
 Created list reports and crosstab reports using Cognos 10.1.
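The BTEQ-based Teradata feeds mentioned above typically follow one pattern: a shell wrapper generates a BTEQ import script and submits it to the database. The sketch below illustrates that pattern only; the logon string, table, and column names (tdprod, stg.customer_feed, and so on) are placeholders, not the actual project objects.

```shell
#!/bin/sh
# Illustrative shell-wrapped BTEQ feed (flat-file extract -> Teradata staging).
# All object names and credentials below are placeholders.

cat > load_customer.bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password
.IMPORT VARTEXT '|' FILE = customer_feed.dat
.QUIET ON
.REPEAT *
USING (cust_id VARCHAR(10), cust_name VARCHAR(50))
INSERT INTO stg.customer_feed (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.LOGOFF
.QUIT
EOF

# Submit the script only where the Teradata client tools are installed.
if command -v bteq >/dev/null 2>&1; then
  bteq < load_customer.bteq > load_customer.log 2>&1
else
  echo "bteq not available; generated load_customer.bteq only"
fi
```

In practice such a wrapper would also archive the input file and check the BTEQ return code before signaling the scheduler.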

Environment: Informatica PowerCenter 9.x, Informatica Cloud, Oracle 10g/11g, Teradata, flat files, DW architectures, ER Studio 9.x, Visio, UNIX Shell Scripting, Informatica Multidomain MDM 9.1.0, Cognos 10.1, OBI SE1, OWB 10gR2, Informatica Data Quality, Windows 7 Professional.

Client: Xerox, Tallahassee, FL Aug’12 – June’13
Role: Sr. ETL Developer

Responsibilities:

 Worked with business analysts and the DBA on requirements gathering, business analysis, and design of the data warehouse.
 Created logical and physical models for the Staging, Transition, and Production warehouses using Erwin 4.0. Integrated IDQ checks with Informatica PowerCenter.
 Used Repository Manager to create user groups and users, and managed users by setting up their privileges and profiles.
 Tuned Informatica session performance for large data files by increasing the block size, data cache size, sequence buffer length, and target-based commit interval.
 Created complex mappings using Unconnected Lookup, Aggregator, and Router transformations to populate target tables efficiently.
 Developed mappings to download an XML file from a web server using the HTTP transformation.
 Worked on XML/XSD/XSLT as part of source XML files for Informatica, as well as input XML for web service calls.
 Developed test documents such as functional test cases and User Acceptance (UA) test cases.
 Designed, installed, and configured core Informatica MDM Hub components: Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, IDD, and data modeling. Involved in development and implementation of SSIS, SSAS, and SSRS solutions.
 Used SSIS and T-SQL stored procedures to transfer data from OLTP databases to the staging area and finally into data marts.
 Created SSIS packages to extract data from OLTP systems and transform it for OLAP systems, and scheduled jobs to call the packages and stored procedures.
 Prepared mapping documents, data migration documents, and other project-related documents such as mapping templates and Visio diagrams.
 Involved in data conversion, migration, integration, and quality profiling tasks.
 Created mapplets and used them in different mappings. Performed manual unit testing of mappings.
 Used the Sorter transformation and the newly introduced dynamic Lookup.
 Created events and tasks in the workflows using Workflow Manager.
 Interacted with business units to understand their needs and designed appropriate RPD objects.
 Developed Informatica mappings and tuned them for better performance, with PL/SQL procedures/functions to build the business rules for loading data.
 Designed and developed Oracle PL/SQL and UNIX shell scripts for data import/export.
 Worked on Teradata utilities (MultiLoad, FastLoad, Export/Import) to improve performance.
 Designed UNIX shell scripts to automate the BTEQ scripts for loading into the Teradata database.
 Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects, and hierarchies. Created reports using Oracle Discoverer.
 Implemented ETL processes using Informatica PowerCenter to load data from the source system (SQL Server) into the DW (Oracle).
 Developed complex transformations and mapplets using Informatica to extract, transform, and load data into data marts, the Enterprise Data Warehouse (EDW), and the Operational Data Store (ODS). Created analytics and dashboard analyses using Oracle Discoverer.
 Created workflows and scheduled and monitored sessions using Informatica Workflow Manager.
 Developed shell scripts for running batch jobs and scheduled them using Autosys.
 Designed and developed reports per the requirements in Cognos 8.2.
 Implemented security in Cognos Transformer models using macros.

Environment: Informatica PowerCenter 9.1, Informatica MDM 9.5.1, Erwin 4.0, Oracle 10g/9i, SQL, Teradata, PL/SQL, TOAD, SQL*Loader, Sun Solaris 2.6, UNIX Shell Scripting, Autosys Scheduler, Cognos 8.2, Oracle 10g R2 Discoverer, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS)
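The MultiLoad daily batch loads described above are driven by a control file of `.LAYOUT`, `.DML`, and `.IMPORT` commands, usually generated and submitted from a shell wrapper. This is a minimal sketch of that shape; the log table, database, table, and field names are illustrative placeholders.

```shell
#!/bin/sh
# Illustrative MultiLoad control file for a daily batch insert.
# All Teradata object names and credentials below are placeholders.

cat > daily_orders.mld <<'EOF'
.LOGTABLE etl_wrk.daily_orders_log;
.LOGON tdprod/etl_user,etl_password;
.BEGIN IMPORT MLOAD TABLES dw.order_fact;
.LAYOUT order_layout;
  .FIELD order_id  * VARCHAR(12);
  .FIELD order_amt * VARCHAR(18);
.DML LABEL ins_orders;
  INSERT INTO dw.order_fact (order_id, order_amt)
  VALUES (:order_id, :order_amt);
.IMPORT INFILE orders.dat FORMAT VARTEXT '|'
  LAYOUT order_layout APPLY ins_orders;
.END MLOAD;
.LOGOFF;
EOF

# Run only where the Teradata client tools are installed.
if command -v mload >/dev/null 2>&1; then
  mload < daily_orders.mld > daily_orders.log 2>&1
else
  echo "mload not available; generated daily_orders.mld only"
fi
```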

Client: Next brick Solution Limited, CA Oct’10 – July’12
Role: Informatica Developer

Responsibilities:

 Interacted with the business community and gathered requirements based on changing needs.
 Incorporated identified factors into Informatica mappings to build the Data Warehouse.
 Assisted in designing logical/physical data models using Erwin 4.0.
 Developed mappings to extract data from Oracle, flat files, and XML files and load it into the Teradata data warehouse using the Mapping Designer.
 Used the Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map the sources to the targets.
 Created MDM mappings using UPMC best practices to capture errors and achieve a clean load using Informatica. Extensively used Teradata SQL within BTEQ.
 Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
 Tracked, reviewed, and analyzed defects.
 Implemented various performance tuning techniques on sources, targets, mappings, and workflows.
 Wrote complex SQL queries in Teradata for testing the Informatica mappings.
 Prepared and maintained technical documentation for all new or modified RPD objects and ETL mappings. Executed MultiLoad scripts for daily batch jobs.
 Developed PL/SQL code for new requirements, enhancements, and debugging of procedures, packages, and other Oracle objects.
 Identified sources, targets, mappings, and sessions and tuned them to improve performance.
 Worked extensively with Teradata utilities such as MultiLoad, TPump, and BTEQ to populate the target data.
 Extensively worked on data extraction, transformation, and loading from XML files, large-volume data, and Adobe PDF files to the EDW using B2B Data Transformation and B2B Data Exchange.
 Collected statistics in Teradata for all queries after every refresh of the DB to improve query performance.
 Prepared a UAT document to perform user acceptance testing.
 Used Informatica Designer to create complex mappings using different transformations such as Filter, Router, connected and unconnected Lookups, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the Data Warehouse.
 Scheduled Informatica jobs through the scheduler for UAT and Production loads.
 Actively participated in database testing, checking constraints, correctness of the data, stored procedures, field size validation, etc.
 Experience creating jobs, alerts, and SQL Server Mail Agent tasks, and scheduling DTS/SSIS packages.
 Developed SSIS packages for file transfer from one location to another using the FTP task.
 Developed DTS packages to copy tables, schemas, and views, and to extract data from Excel and Oracle using SSIS. Involved in scheduling the workflows through the process scheduler.
 Embedded pmcmd commands in shell scripts in command-line mode, and used pmcmd and pmrep commands in interactive mode to access the repository and workflows.
 Created tables, synonyms, views (particularly with analytic functions), and sequences in Netezza.
 Worked closely with the business analysts' team to resolve problem tickets and service requests, providing 24/7 production support.
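Embedding pmcmd in a shell script, as described above, usually amounts to composing a `startworkflow` command from environment-specific parameters and checking its return code. A minimal sketch follows; the domain, Integration Service, folder, and workflow names are hypothetical placeholders, and credentials are read from environment variables (pmcmd's `-uv`/`-pv` options) rather than hard-coded.

```shell
#!/bin/sh
# Illustrative pmcmd wrapper for starting a workflow from a batch script.
# Domain/service/folder/workflow names below are placeholders.

DOMAIN="Domain_ETL"
INTSVC="IS_ETL"
FOLDER="DW_LOADS"
WORKFLOW="wf_daily_sales_load"

CMD="pmcmd startworkflow -sv $INTSVC -d $DOMAIN -uv PMUSER -pv PMPASS -f $FOLDER -wait $WORKFLOW"
echo "$CMD" > pmcmd_last_invocation.txt   # keep an audit trail of what was submitted

# Invoke only where the Informatica client is installed.
if command -v pmcmd >/dev/null 2>&1; then
  $CMD
  echo "pmcmd exit status: $?"
else
  echo "pmcmd not available; command recorded only"
fi
```

With `-wait`, the script blocks until the workflow completes, so the exit status can gate downstream steps in the scheduler.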

Environment: Informatica MDM 9.5.1, Informatica PowerExchange, Netezza, PL/SQL, Teradata V12, flat files, Linux, Erwin, TWS Maestro, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS)

Client: GE Healthcare, Glen Allen, VA Mar’09 – Sept’10
Role: Informatica Developer

Responsibilities:

 Instrumental in analysis, requirements gathering, and documentation of functional and technical specifications.
 Involved in dimensional modeling to design and develop star schemas, using Erwin to identify fact and dimension tables.
 Worked extensively with Informatica client tools such as Designer, Workflow Manager, and Workflow Monitor.
 Based on the business requirements, created reusable transformations in the Transformation Developer and mapplets in the Mapplet Designer.
 Developed complex mappings to transform data using Rank, Sorter, Stored Procedure, Joiner, Aggregator, Filter, connected Lookup, unconnected Lookup, and Router transformations.
 Implemented Slowly Changing Dimensions Type 2 to keep track of historical data.
 Extensively used dynamic lookup caches for slowly changing dimensions.
 Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions, and scheduled them to run at specified times with the required frequency.
 Implemented performance tuning techniques by identifying and resolving bottlenecks in sources, targets, transformations, mappings, and sessions.
 Performed analytics on big data using Netezza (for structured data).
 Extensively used TOAD and PL/SQL Developer for daily development jobs.
 Used Workflow Monitor to monitor the jobs, reviewed the error logs generated for each session, and resolved the errors. Extensively performed unit testing and system testing.
 Performed database-side tuning using EXPLAIN PLAN and ANALYZE TABLE queries.
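The Slowly Changing Dimension Type 2 logic mentioned above follows a close-and-insert pattern: expire the current dimension row when a tracked attribute changes, then insert the new version with an open-ended effective date. The sketch below expresses that pattern as SQL inside a generated BTEQ script, consistent with the shell/BTEQ tooling used elsewhere in this work; the table and column names (dw.customer_dim, stg.customer, and so on) are illustrative, not the actual project schema.

```shell
#!/bin/sh
# Illustrative SCD Type 2 close-and-insert pattern, generated as a BTEQ script.
# All object names and credentials below are placeholders.

cat > scd2_customer.bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password

/* Step 1: close out the current row when a tracked attribute changed */
UPDATE dw.customer_dim
SET eff_end_dt = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE current_flag = 'Y'
  AND cust_id IN (
      SELECT s.cust_id
      FROM stg.customer s
      JOIN dw.customer_dim d
        ON s.cust_id = d.cust_id AND d.current_flag = 'Y'
      WHERE s.cust_name <> d.cust_name);

/* Step 2: insert new and changed rows as the current version */
INSERT INTO dw.customer_dim
  (cust_id, cust_name, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg.customer s
WHERE NOT EXISTS (
      SELECT 1 FROM dw.customer_dim d
      WHERE d.cust_id = s.cust_id
        AND d.current_flag = 'Y'
        AND d.cust_name = s.cust_name);

.LOGOFF
.QUIT
EOF

if command -v bteq >/dev/null 2>&1; then
  bteq < scd2_customer.bteq > scd2_customer.log 2>&1
else
  echo "bteq not available; generated scd2_customer.bteq only"
fi
```

In PowerCenter the same pattern is implemented with a dynamic Lookup cache plus an Update Strategy transformation, but the row-versioning logic is identical.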

Environment: Informatica PowerExchange, Netezza, DB2, SQL Server 2008, PL/SQL, Teradata, Toad, ERwin 3.5, Linux, IBM Data Studio.

Client: Medica, Minneapolis, MN Aug’07 – Feb’09
Role: ETL Developer

Responsibilities:

 Understood the customer requirements and analyzed and resolved discrepancies in the business requirements.
 Worked to design an architecture that isolates the application component (business context) of the data integration solution from the technology, which also enables reuse of skills, design objects, and knowledge.
 Prepared the design documents per the requirements of the business analysts and the data model.
 Extensively used Informatica to load data from flat files, XML files, and relational tables.
 Developed PowerExchange data maps used to pull data from files on the mainframe/VSAM files.
 Developed shell scripts for retrieving files from the FTP server, archiving the source files, concatenating files, and delivering them to a remote shared drive.
 Developed the error logic for streamlining and automating the data loads and cleansing incorrect data, and developed an auditing mechanism to maintain load statistics for transactional records.
 Designed and developed standard load strategies to load data from source systems to the Actimise database, the final target system.
 Prepared documentation on the design, development, implementation, daily loads, and process flow of the mappings, and participated in design document reviews.
 Validated the table design and the records populated in the DB, and tested the data to check whether it was loaded correctly into the required schemas per the business requirements.
 Extensively involved in performance tuning of mappings, sessions, and database tables, using parameter files, variables, cache mechanisms, and SQL overrides.
 Designed and managed a high-availability database infrastructure providing scalability and reliability, with a multi-site replication architecture and DR strategy, to comply with firm-wide critical application requirements.
 Utilized PowerDesigner to create logical and physical data structures based on a metadata-driven architecture; the data is sourced from external vendors and supplemented with information from internal settlement and trading systems.
 Introduced Informatica for building end-of-day extracts for asset classes to help meet SLAs for downstream users.

Environment: Informatica PowerCenter 8.1, PowerExchange Navigator v8.1, mainframe source system, DB2, AIX as platform, and CA7 scheduling tool.

Client: Reddy Labs, Hyderabad, India July’06 – June’07
Role: ETL Developer

Responsibilities:

 Performed requirement analysis and design.
 Coding, including shell scripts and Informatica mappings, sessions, and workflows.
 Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica PowerCenter.
 Developed shell scripts, PL/SQL stored procedures, and table and index creation scripts.
 Created complex transformations using Source Qualifier, Joiner, Aggregator, Lookup, Router, Expression, Update Strategy, and Sequence Generator.
 Used the Informatica PowerCenter Server Manager to create sessions and batches to run with the logic embedded in the mappings.
 Involved in developing the SQL scripts to extract data and load the data warehouse.
 Involved in unit and integration testing of Informatica sessions and target data.

Environment: Informatica PowerCenter 6.2, MS Access, SQL Server 2000, Oracle 8i/7i, PL/SQL, SQL *Loader, Windows 2000, UNIX.

 Educational Qualifications

 Bachelor's in Computer Science, Jawaharlal Nehru Technological University (JNTU), India.

Certifications:

 Certified professional in Introduction to Oracle 9i.
 Certified Informatica 8.6 Developer.