

Abhishek Gupta

Contact : +91 9953507179    Email : [email protected]

To leverage my knowledge of software development in the IT sector and play a proactive role in software management, providing customers with the best IT solutions.

Profile Summary

• Over 6.2 years of IT experience, including 4.9+ years in data warehousing and ETL (DataStage 9.x, SQL, Unix, Oracle, DB2) and Big Data (Hadoop, Pig, Hive, Sqoop, HDFS, YARN, MapReduce), along with extensive experience on Mainframes

• Experienced ETL developer: worked on development, enhancement, and maintenance of applications, covering analysis, design, coding, and testing using various stages such as Transformer, Sequential File, Aggregator, XML, Funnel, Join, Lookup, etc.

• Assisted in ETL performance monitoring and tuning: analyzed ETL workflows to identify performance bottlenecks using appropriate stages and partitioning methods

• Assisted with technical documentation, ETL maps, batches, and processes, and was involved in the requirement analysis phase

• Involved in creating mapping documents through regular interaction with the client and other component teams to understand the requirements for job design and development

• Worked on Health Care, Banking/Financial, and Telecom domain projects

• Experienced with major Hadoop ecosystem components such as YARN, HDFS, MapReduce, Sqoop, Pig, and Hive

• Experienced in importing/exporting data with Sqoop between HDFS and relational database systems

Technical Skills

Data Warehouse Tools  : ETL – IBM InfoSphere DataStage 9.x
Big Data Skills       : Hadoop ecosystem, HDFS, Sqoop, MapReduce, Hive, Pig
Databases             : Oracle, DB2, Teradata
Scripting Languages   : Unix shell, SQL
Batch Scheduler       : Tivoli Workload Scheduler (TWS)
Desktop Applications  : Citrix, WebEx online meetings, Lotus Notes, Office 2007
Programming Languages : Mainframe – COBOL, JCL, VSAM, CICS
Tools & Utilities     : Quality Center (QC), Remedy, Citrix


Experience Details

Project Details #1
Company : United Health Group (Apr 2014 – till date)
Project : Polaris/MDM – Cirrus Integration

Tools/Technologies : IBM DataStage 9.x, DB2, SQL, Unix, Tivoli, Altova XMLSpy, Hadoop, Pig, Hive, Sqoop

Description

Cirrus Integration is a healthcare-domain project in which we fetch data from multiple sources and load it into a single repository called Master Data Management (MDM). We extract all sources of information and generate XML files in a specified format, applying various business rules, for delivery to the end user, Cirrus. This information can then be shared among multiple applications inside UHG from a single source. The system also improves the speed of business decision making through advanced analytics, with the goal of providing a single source of truth for UHG.

Responsibilities

• Involved in extracting, transforming, loading, and testing data from datasets, flat files, XML files, and databases

• Developed/modified DataStage jobs, wrote and optimized SQL, and developed shell scripts per business requirements

• Migrated components such as DataStage jobs, Unix scripts, and SQL scripts using CLM/SVN/Manager server

• Involved in building logic for error management and capturing daily load status for ETL runs

• Loaded data into HDFS and extracted data from the DB2 database into HDFS

• Used Sqoop to import data from DB2 into Hive tables

• Used Pig scripting for data validation and transformation

• Created and updated Hive schemas; loaded and queried data

• Involved in technical documentation, mapping documents, batches, unit test case documents, and processes

• Involved in daily meetings with onshore teams, the manager, and team leads
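As an illustration of the Sqoop-based DB2-to-Hive loads described above, a minimal sketch follows. All connection details, schema names, and table names are placeholders for this example, not the actual UHG environment, and the command is echoed rather than executed since it would need a live cluster.

```shell
# Dry-run sketch of a Sqoop 1 import from DB2 into a Hive table.
# Host, port, database, user, and table names below are hypothetical.
SQOOP_IMPORT="sqoop import \
  --connect jdbc:db2://db2host:50000/CLAIMSDB \
  --username etl_user -P \
  --table MEMBER_CLAIMS \
  --hive-import --hive-table staging.member_claims \
  --num-mappers 4"

# Print the command instead of running it (no cluster in this sketch).
echo "$SQOOP_IMPORT"
```

In a real run, --num-mappers controls parallelism of the import, and --hive-import creates/loads the target Hive table directly.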

Project Details #2
Company : Atos India Pvt. Ltd.
Project : IBM – VODAFONE

Tools/Technologies : IBM DataStage (Information Server) v8.7/8.1, Teradata, Oracle 11i, Cognos, Unix scripting, Tivoli

Description

Vodafone is a telecom project integrating multiple processes. Each process is handled by a dedicated team that maintains the daily database and handles the daily transactions, ultimately generating reports for the Vodafone business.


Responsibilities

• Involved in extracting, transforming, loading, and testing data from datasets, flat files, and database stages

• Automated the ETL process using the Tivoli scheduling tool

• Involved in building logic for error management and capturing daily load status to audit tables for ETL runs

• Generated data and handed it to the Cognos team to develop further reports in Cognos

• Performed day-to-day Cognos administration activities such as monitoring scheduled jobs, the cube-building process, portal checks, and server performance
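The scheduler automation and audit-status capture above can be sketched as a wrapper script of the kind a tool like TWS might invoke. The script, function, and table names here are illustrative assumptions, not the actual Vodafone setup; the real version would run a DataStage job and write the status row via SQL.

```shell
#!/bin/sh
# Illustrative scheduler wrapper: run one ETL step, capture its exit
# status, and emit a daily load-status line for the audit trail.

run_etl() {
  # Stand-in for the real ETL command (e.g. a dsjob invocation);
  # it simply succeeds for this demo.
  true
}

RUN_DATE=$(date +%Y-%m-%d)
if run_etl; then
  STATUS="SUCCESS"
else
  STATUS="FAILURE"
fi

# In the real flow this record would be inserted into an audit table;
# here it is echoed so the scheduler log captures it.
echo "$RUN_DATE ETL_DAILY_LOAD $STATUS"
```

Keeping the status capture in the wrapper, rather than inside each job, lets the scheduler treat every ETL step uniformly.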

Project Details #3
Company : HSBC Software Development Pvt. Ltd.
Project : FTP Fin-Pro
Tools/Technologies : DataStage 8.x, SQL, Unix, Citrix, IBM SAFR, Quality Center

Description

FTP is an acronym for 'Finance Transformation Platform'. FTP is a multi-component integrated system that provides financial control with a common data repository for regulatory and management reporting. It deploys an 'events-based' architecture and also creates a data warehouse to store reconciled, detailed transaction-level data to support reporting.

Responsibilities

• Defined the source and target databases in DataStage and created the jobs and job sequences

• Set up the environment for UIT; prepared input files for my own and other components

• Tracked defects and was solely responsible for defect retesting and impact analysis on other modules

• Performed SIT and supported performance testing

• Used most of the common stages, such as Transformer, Sequential File, Aggregator, Remove Duplicates, Funnel, Join, Lookup, etc.

Project Details #4
Company : Tridat Technology Pvt. Ltd., Mumbai
Project : OfficeMax Retailer Supplier
Tools/Technologies : COBOL, JCL, Easytrieve, SQL SPUFI, Abend-Aid, Xpediter, Remedy, DB2

Description

OfficeMax is an office supplies retailer founded in 1988 and headquartered in Naperville. The project creates custom summary bill layouts, including various rules for tax calculations and bill formats, balances each bill against the database, and sends out the correct file.

Responsibilities

• Wrote COBOL code from scratch for each and every customer layout

• Analyzed and modified existing COBOL programs

• Tested various components, including system testing, unit testing, and integration testing of the products

• Trained new team members on the processes involved


Educational Qualification

Qualification    Year of Completion    Board/Institution
B.E.             2009                  Pune University
12th             2004                  C.B.S.E.
10th             2002                  Uttar Pradesh

Extra-Curricular Activities

• Worked as a DataStage trainer

• Organized events in the company, including arranging a BU event

• Received excellence certifications for various programs

• Highly interested in Internet surfing and discovering new things

• Worked as an event organizer for college functions

• Presented papers at college events

Personal Details

Name                   : Abhishek Gupta
Father's Name          : Shri Sunil Gupta
Nationality            : Indian
Marital Status         : Married
Date of Birth          : 02 Apr 1988
PAN Card No.           : ALBPG5564F
Passport No.           : Z2538735
Mobile Number          : +91-9953507179
Address                : Gaur City, Noida Extension, U.P.
Linguistic Proficiency : English, Hindi, Marathi

Declaration

I hereby declare that the particulars given herein are true to the best of my knowledge and belief.

Date – / /2016 Abhishek Gupta