Indeed Resume

Aditya Dhavala
Hadoop Developer - Sears Holdings
Email me on Indeed: indeed.com/r/Aditya-Dhavala/4a1ec64204eb4bd3

• Around 7 years of IT experience as a developer, designer and quality reviewer, with cross-platform integration experience using Hadoop, Java, J2EE and SOA.
• Hands-on experience installing, configuring and using Apache Hadoop ecosystem components such as MapReduce, Hive, Pig, Sqoop, Flume and Oozie.
• Hands-on experience with Hortonworks and Cloudera Hadoop environments.
• Strong understanding of Hadoop daemons and MapReduce concepts.
• Experienced in importing and exporting data to and from HDFS.
• Experienced in analyzing big data in the Hadoop environment.
• Experienced in handling Hadoop ecosystem projects such as Hive, Pig and Sqoop.
• Experienced in developing UDFs for Hive using Java.
• Strong understanding of NoSQL databases such as HBase, MongoDB and Cassandra.
• Familiar with handling complex data-processing jobs using Cascading.
• Hands-on experience with Hadoop, HDFS, MapReduce and the Hadoop ecosystem (Pig, Hive, Oozie, Flume and HBase).
• Extensive experience in the design, development and support of Model View Controller applications using the Struts and Spring frameworks.
• Developed reusable solutions to maintain consistent coding standards across different Java projects.
• Proficient with application servers such as WebSphere, WebLogic, JBoss and Tomcat.
• Developed core modules in large cross-platform applications using Java, J2EE, Spring, Struts, Hibernate, JAX-WS web services and JMS.
• Expertise in debugging and performance tuning of Oracle and Java, with strong knowledge of Oracle 11g and SQL.
• Ability to work effectively in cross-functional team environments; experienced in providing training to business users.

WORK EXPERIENCE

Hadoop Developer
Sears Holdings - Hoffman Estates, IL - February 2013 to Present

Sears Holdings Corporation is a leading integrated retailer with almost 2,500 full-line and specialty retail stores in the United States and Canada. Sears Holdings is the leading home appliance retailer as well as a leader in tools, lawn and garden, fitness equipment, and automotive repair and maintenance. I was part of the Big Data Processing team, which used available customer data to make better decisions that significantly enhanced organizational success. I was involved in setting up the Cloudera Hadoop cluster and wrote MapReduce jobs, Hive queries and Pig Latin scripts to explore customer sales data for information significant to trend analysis.

Responsibilities:
• Involved in the end-to-end process of Hadoop cluster installation, configuration and monitoring.
• Responsible for building scalable distributed data solutions using Hadoop.
• Installed and configured Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster.
• Set up and benchmarked Hadoop/HBase clusters for internal use.
• Developed simple to complex MapReduce jobs using Hive and Pig.
• Optimized MapReduce jobs to use HDFS efficiently through various compression mechanisms.
• Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
• Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior.
• Implemented business logic in Hadoop by writing UDFs in Java (a brief sketch follows this section) and used various UDFs from Piggybank and other sources.
• Continuously monitored and managed the Hadoop cluster using Cloudera Manager.
• Worked with application teams to install operating system and Hadoop updates, patches and version upgrades as required.
• Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
• Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Java (JDK 1.6), SQL, Cloudera Manager, Sqoop, Flume, Oozie, Eclipse
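The role above mentions implementing business logic as Hive UDFs in Java. As an illustrative sketch only, here is what such a UDF can look like using Hive's classic UDF API; the class name and the normalization rule are hypothetical, not details from the resume:

    // Minimal Hive UDF sketch (classic org.apache.hadoop.hive.ql.exec.UDF API).
    // The class name and business rule are hypothetical examples.
    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public final class NormalizeCode extends UDF {
        // Hive calls evaluate() once per row; returning null yields SQL NULL.
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            // Hypothetical rule: trim whitespace and upper-case the value.
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Such a UDF would typically be packaged into a JAR, loaded with ADD JAR, and registered with CREATE TEMPORARY FUNCTION before use in a Hive query.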
Hadoop Developer
American Express - Phoenix, AZ - February 2012 to January 2013

American Express provides innovative payment, travel and expense management solutions for individuals and businesses of all sizes. It helps customers realize their dreams and aspirations through industry-leading benefits, access to unique experiences, business-building insights, and global customer care. The purpose of the project was to create an Enterprise Data Hub so that various business units could use the data in Hadoop for data analytics. The solution is based on Cloudera Hadoop: data is stored in the Hadoop file system and processed using MapReduce jobs.

Responsibilities:
• Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleansing and preprocessing (a brief sketch follows this section).
• Involved in loading data from the UNIX file system to HDFS.
• Installed and configured Hive and wrote Hive UDFs.
• Evaluated business requirements and prepared detailed specifications, following project guidelines, for the programs to be developed.
• Devised procedures to solve complex business problems with due consideration for hardware/software capacity and limitations, operating times and desired results.
• Analyzed large data sets to determine the optimal way to aggregate and report on them.
• Provided quick responses to ad hoc internal and external client requests for data; experienced in creating ad hoc reports.
• Responsible for building scalable distributed data solutions using Hadoop.
• Responsible for cluster maintenance: adding and removing cluster nodes, cluster monitoring and troubleshooting, managing and reviewing data backups, and managing and reviewing Hadoop log files.
• Worked hands-on with the ETL process.
• Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
• Extracted data from Teradata into HDFS using Sqoop.
• Analyzed the data by performing Hive queries and running Pig scripts to understand user behavior (shopping enthusiasts, travelers, music lovers, etc.).
• Exported the analyzed patterns back into Teradata using Sqoop.
• Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
• Installed the Oozie workflow engine to run multiple Hive jobs.
• Developed Hive queries to process the data and generate data cubes for visualization.

Environment: Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g/10g, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX shell scripting
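The role above mentions MapReduce jobs written in Java for data cleansing and preprocessing. As a hedged sketch only (the delimiter, field count and class name are assumptions for illustration, not details from the resume), a cleansing mapper of that kind can look like this:

    // Minimal sketch of a data-cleansing step: records that do not match the
    // expected layout are counted and dropped.
    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class CleanseMapper
            extends Mapper<LongWritable, Text, Text, NullWritable> {

        private static final int EXPECTED_FIELDS = 12; // hypothetical layout

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Assume pipe-delimited input; the -1 keeps trailing empty fields.
            String[] fields = value.toString().split("\\|", -1);
            if (fields.length != EXPECTED_FIELDS) {
                context.getCounter("cleanse", "malformed").increment(1);
                return; // drop the malformed record
            }
            context.write(value, NullWritable.get());
        }
    }

Run as a map-only job (zero reducers), this writes the clean records straight back to HDFS while the counter gives a quick data-quality signal.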
Java/J2EE Application Developer
Columbia Bank - Lakewood, WA - August 2010 to January 2012

The Columbia Bank Card Verification system is an assessment system for credit card applications. It processes a customer's credit card application until it is either accepted or rejected.

Responsibilities:
• Worked with Java, J2EE, Struts, web services and Hibernate in a fast-paced development environment.
• Followed agile methodology; interacted directly with the client to provide and take feedback on features, suggest and implement optimal solutions, and tailor the application to customer needs.
• Gained rich experience in database design and hands-on experience with large database systems: Oracle 8i and Oracle 9i.
• Involved in the design and implementation of the web tier using Servlets and JSP.
• Used Apache POI for reading Excel files.
• Wrote build scripts with Ant for deploying WAR and EAR applications.
• Developed the user interface using JSP and JavaScript to view all online trading transactions.
• Designed and developed Data Access Objects (DAO) to access the database (see the DAO sketch at the end of this resume).
• Used the DAO Factory and Value Object design patterns to organize and integrate the Java objects.
• Coded Java Server Pages for dynamic front-end content that uses Servlets and EJBs.
• Coded HTML pages using CSS for static content generation, with JavaScript for validations.
• Used the JDBC API to connect to the database and carry out database operations.
• Used JSP and JSTL tag libraries for developing user interface components.
• Performed code reviews.
• Performed unit testing, system testing and integration testing.
• Involved in building and deploying the application in a Linux environment.
• Deployed the application on development and production servers.

Environment: Java, J2EE, JDBC, Struts, SQL, Hibernate, Eclipse, Apache POI, CSS

Java/J2EE Developer
Capital One Bank - McLean, VA - October 2009 to June 2010

Capital One Auto Finance (COAF) is a project in which we designed an application that Capital One Bank can use to handle different types of auto loans depending on a customer's eligibility. COAF also includes maintenance of the existing application.

Responsibilities:
• Played the role of Java developer in the project called "Coverage Selection Tool".
• Technologies involved were EJB 3.0, web services, Dojo (UI framework) and other J2EE server components.
• Analyzed and prepared technical specifications with UML diagrams (use case, class and sequence diagrams).
• Used Rational Rose to develop the components required by the client.
• Wrote complex logic for forecasting the prices of products and subparts in future quarters.
• Developed business components applying OOAD and using design patterns such as DAO, Value Object, DTO, Factory and Singleton.
• Implemented the DOM parsing module and created XSD and XSLT components.
• Used stored procedures and triggers extensively to develop back-end business logic in the Oracle database.
• Involved in performance improvement and bug fixing.
• Analyzed old database table fields and mapped them to new schema tables using complex SQL queries.
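Both Java/J2EE roles above mention the DAO pattern. As a hedged sketch only, here is the general shape of a JDBC-backed DAO; the class, table and column names are hypothetical, and it uses try-with-resources for brevity rather than the JDK 1.6-era finally blocks those projects would likely have used:

    // Minimal DAO sketch: SQL access kept behind one plain-Java class.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import javax.sql.DataSource;

    public class CardApplicationDao {
        private final DataSource dataSource;

        public CardApplicationDao(DataSource dataSource) {
            this.dataSource = dataSource;
        }

        // Look up the current status of a credit card application.
        // Table and column names are hypothetical examples.
        public String findStatus(long applicationId) throws SQLException {
            String sql = "SELECT status FROM card_application WHERE app_id = ?";
            try (Connection con = dataSource.getConnection();
                 PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setLong(1, applicationId);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getString("status") : null;
                }
            }
        }
    }

Keeping the SQL behind a DAO like this is what lets the Factory and Value Object patterns mentioned above compose cleanly: callers receive plain objects and never touch JDBC directly.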
